Compare commits

...

60 Commits

Author SHA1 Message Date
Will Miao
0968698804 feat: add v0.9.8 release notes and update version 2025-10-15 19:47:13 +08:00
Will Miao
a5b2e9b0bf feat: add update service dependency and has_update filter
- Pass ModelUpdateService to CheckpointService, EmbeddingService, and LoraService constructors
- Add has_update query parameter filter to model listing handler
- Update BaseModelService to accept optional update_service parameter

These changes enable model update functionality across different model types and provide filtering capability for models with available updates.
2025-10-15 17:25:16 +08:00
pixelpaws
5a6ff444b9 Merge pull request #572 from willmiao/codex/design-ui-for-model-update-notifications
refactor: tighten civitai update endpoints
2025-10-15 16:01:20 +08:00
pixelpaws
3bb240d3c1 fix(updates): avoid caching failed civitai lookups 2025-10-15 16:00:23 +08:00
pixelpaws
ee0d241c75 refactor(routes): limit update endpoints to essentials 2025-10-15 15:37:35 +08:00
Will Miao
321ff72953 feat: remove delay from bulk auto-organize completion 2025-10-15 10:32:59 +08:00
Will Miao
412f1e62a1 feat(i18n): add model name display option and improve localization, fixes #440
- Add new model name display setting with options to show model name or file name
- Implement helper function to determine display name based on user preference
- Update model card footer to use dynamic display name
- Include model name display setting in settings modal and state management
- Remove redundant labels from display density descriptions in multiple locales
- Simplify card info display descriptions by removing duplicate text

The changes provide cleaner UI text and add flexibility for users to choose between displaying model names or file names in card footers.
2025-10-15 10:23:39 +08:00
Will Miao
8901b32a55 Merge branch 'main' of https://github.com/willmiao/ComfyUI-Lora-Manager 2025-10-15 09:19:05 +08:00
Will Miao
8ab6cc72ad feat: add project documentation in IFLOW.md 2025-10-15 09:18:59 +08:00
pixelpaws
52e671638b feat(example-images): add stop control for download panel 2025-10-15 08:46:03 +08:00
Will Miao
a3070f8d82 feat: add rate limit error handling to CivArchive client
- Add RateLimitError import and exception handling in API methods
- Create _make_request wrapper to surface rate limit errors from downloader
- Add test case to verify rate limit error propagation
- Set default provider as "civarchive_api" for rate limit errors

This ensures rate limit errors are properly propagated and handled throughout the CivArchive client, improving error reporting and allowing callers to implement appropriate retry logic.
2025-10-14 21:38:24 +08:00
Will Miao
3fde474583 feat(civitai): add rate limiting support and error handling
- Add RateLimitError import and _make_request wrapper method to handle rate limiting
- Update API methods to use _make_request wrapper instead of direct downloader calls
- Add explicit RateLimitError handling in API methods to properly propagate rate limit errors
- Add _extract_retry_after method to parse Retry-After headers
- Improve error handling by surfacing rate limit information to callers

These changes ensure that rate limiting from the Civitai API is properly detected and handled, allowing callers to implement appropriate backoff strategies when rate limits are encountered.
2025-10-14 21:38:24 +08:00
Will Miao
1454991d6d feat(i18n): update bulk action labels to reflect selected items
Change bulk action labels from "All" to "Selected" in both English and Chinese locales to accurately reflect that these actions apply only to selected items rather than all items. This improves user interface clarity and prevents potential confusion about the scope of bulk operations.
2025-10-14 21:36:11 +08:00
Will Miao
4398851bb9 feat: reorder and update context menu items
- Remove 'clear' action from context menu
- Reorder menu items to prioritize common operations
- Move destructive operations (delete, move) to bottom section
- Add visual separation between action groups
- Maintain all existing functionality with improved organization
2025-10-14 21:29:09 +08:00
Will Miao
5173aa6c20 feat(model-scanner): add logging for file processing, fixes #566 2025-10-14 19:44:59 +08:00
Will Miao
3d98572a62 feat: improve civitai data handling and type safety, fixes #565
- Replace setdefault with get and explicit dict initialization in MetadataUpdater
- Change civitai field type from Optional[Dict] to Dict[str, Any] with default_factory
- Add None check and dict initialization in BaseModelMetadata.__post_init__
- Ensures civitai data is always a dictionary, preventing type errors and improving code reliability
2025-10-14 16:03:33 +08:00
Will Miao
c48095d9c6 feat: replace IO type imports with string literals
Remove direct imports of IO type constants from comfy.comfy_types and replace them with string literals "STRING" in input type definitions and return types. This improves code portability and reduces dependency on external type definitions.

Changes made across multiple files:
- Remove `from comfy.comfy_types import IO` imports
- Replace `IO.STRING` with "STRING" in INPUT_TYPES and RETURN_TYPES
- Move CLIPTextEncode import to function scope in prompt.py for better dependency management

This refactor maintains the same functionality while making the code more self-contained and reducing external dependencies.
2025-10-14 09:12:55 +08:00
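As a hedged illustration of this refactor, a node definition before and after might look like the following. `ExampleNode` is hypothetical; only the `IO.STRING` → `"STRING"` substitution comes from the commit message.

```python
# Before (illustrative): tied to ComfyUI's type constants
#   from comfy.comfy_types import IO
#   RETURN_TYPES = (IO.STRING,)

# After: plain string literals keep the node definition self-contained
class ExampleNode:
    @classmethod
    def INPUT_TYPES(cls):
        # ComfyUI node inputs are declared as (type_name, options) tuples
        return {"required": {"text": ("STRING", {"multiline": True})}}

    RETURN_TYPES = ("STRING",)
    FUNCTION = "run"

    def run(self, text):
        return (text,)
```

Since ComfyUI compares these type identifiers as strings anyway, the literal form behaves identically while dropping the import dependency.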
Will Miao
1e4d1b8f15 feat(nodes): add Prompt (LoraManager) node and autocomplete support 2025-10-13 23:23:32 +08:00
pixelpaws
8c037465ba Merge pull request #564 from willmiao/codex/design-apis-for-pause-and-resume-download
test: add coverage for download pause and resume controls
2025-10-13 21:39:47 +08:00
pixelpaws
055c1ca0d4 test(downloads): cover pause and resume flows 2025-10-13 21:30:23 +08:00
Will Miao
27370df93a feat(download): add support to download models from civarchive, fixes #381 2025-10-13 19:27:56 +08:00
Will Miao
60d23aa238 feat(download): enhance download progress ui with transfer stats 2025-10-13 19:06:51 +08:00
pixelpaws
5e441d9c4f Merge pull request #563 from willmiao/codex/add-download-speed-info-to-progress
feat(downloads): expose throughput metrics in progress APIs
2025-10-13 18:11:32 +08:00
pixelpaws
eb76468280 feat(downloads): expose throughput metrics in progress APIs 2025-10-13 14:39:31 +08:00
Will Miao
01bbaa31a8 fix(ModelTags): fix performance and UX issues in ModelTags 2025-10-12 22:31:10 +08:00
Will Miao
bddf023dc4 Merge branch 'main' of https://github.com/willmiao/ComfyUI-Lora-Manager 2025-10-12 17:43:31 +08:00
Will Miao
8e69a247ed feat(i18n): Update priority tags translations for better localization 2025-10-12 17:43:26 +08:00
pixelpaws
97141b01e1 Merge pull request #562 from willmiao/incremental-cache, see #561
feat(model-scanner): add metadata tracking and improve cache management
2025-10-12 17:09:19 +08:00
Will Miao
acf610ddff feat(model-scanner): add metadata tracking and improve cache management
- Add metadata_source field to track origin of model metadata
- Define MODEL_COLUMNS constants for consistent column management
- Refactor SQL queries to use dynamic column selection
- Improve Civitai data detection to include creator_username and trained_words
- Update database operations to handle new metadata field and tag management
2025-10-12 16:54:39 +08:00
Will Miao
a9a6f66035 feat(api): enhance model API creation with validation and default fallback
Refactored `createModelApiClient` to pass the specific model type as a parameter to each client constructor. Introduced `isValidModelType` for validation and added logic to set a default model type if provided type is invalid or not specified. Updated `getModelApiClient` function to utilize these improvements, ensuring robust model API instantiation.
2025-10-12 15:28:30 +08:00
Will Miao
0040863a03 feat(tests): introduce ROUTE_CALLS_KEY for organizing route calls
Addressed the aiohttp warnings by aligning the test scaffolding with current best practices. Added an AppKey constant and stored the route tracking list through it to satisfy aiohttp’s NotAppKeyWarning expectations. Swapped the websocket lambdas for async no-op handlers so the registered routes now point to coroutine callables, clearing the deprecation warning about bare functions.
2025-10-12 09:12:57 +08:00
Will Miao
4ab86b4ae2 feat(locale): add drag drop error message in locales 2025-10-12 09:07:36 +08:00
Will Miao
b32b4b4042 feat: enhance model scanning to include creator username
Updated the `ModelScanner` class to extract and format the creator username from Civitai data. This enhancement ensures that the creator information is properly included in slim model data.
2025-10-12 08:51:42 +08:00
Will Miao
4e552dcf3e feat: Add drag-and-drop support with visual feedback for sidebar nodes
This commit implements drag-and-drop functionality for sidebar nodes,
adding visual feedback via highlight styling when dragging over
valid drop targets. The CSS introduces new classes to style
drop indicators using the lora-accent color scheme, while the JS
adds event handlers to manage drag operations and update the UI
state accordingly. This improves user interaction by providing
clear visual cues for valid drop areas during file operations.
2025-10-12 06:55:01 +08:00
Will Miao
8f4c02efdc Merge branch 'main' of https://github.com/willmiao/ComfyUI-Lora-Manager 2025-10-12 05:44:07 +08:00
Will Miao
b77c596f3a Fix error message to improve clarity in DownloadManager 2025-10-12 05:43:59 +08:00
pixelpaws
181f0b5626 Merge pull request #560 from willmiao/codex/analyze-coverage-for-backend-tests
test: add standalone bootstrap and model factory coverage
2025-10-11 22:48:35 +08:00
pixelpaws
480e5d966f test: add standalone bootstrap and model factory coverage 2025-10-11 22:42:24 +08:00
Will Miao
e8636b949d feat(ModelTags): Implemented drag-and-drop reordering for tag edit mode so users can rearrange tags directly in the UI, fixes #414 2025-10-11 20:56:22 +08:00
pixelpaws
8ea369db47 Merge pull request #559 from willmiao/codex/add-info-level-logging-in-fetch_and_update_model
feat: log metadata channel on metadata fetch
2025-10-11 20:37:08 +08:00
Will Miao
ec9b37eb53 feat: add model type context to tag suggestions
- Pass modelType parameter to setupTagEditMode function
- Implement model type aware priority tag suggestions
- Add model type normalization and resolution logic
- Handle suggestion state reset when model type changes
- Maintain backward compatibility with existing functionality

The changes enable context-aware tag suggestions based on model type, improving tag relevance and user experience when editing tags for different model types.
2025-10-11 20:36:38 +08:00
Will Miao
b0847f6b87 feat(doc): update priority tags configuration guide wiki 2025-10-11 20:08:42 +08:00
pixelpaws
84d10b1f3b feat(metadata): log metadata channel on fetch 2025-10-11 20:07:01 +08:00
Will Miao
4fdc97d062 Merge branch 'main' of https://github.com/willmiao/ComfyUI-Lora-Manager 2025-10-11 19:43:41 +08:00
Will Miao
5fe5e7ea54 feat(ui): enhance settings modal styling and add priority tags tabs
- Rename `.settings-open-location-button` to `.settings-action-link` for better semantic meaning
- Add enhanced hover/focus states with accent colors and border transitions
- Implement tabbed interface for priority tags with LoRA, checkpoint, and embedding sections
- Improve input styling with consistent error states and example code formatting
- Remove deprecated grid layout in favor of tab-based organization
- Add responsive tab navigation with proper focus management and visual feedback
2025-10-11 19:43:22 +08:00
pixelpaws
7be1a2bd65 Merge pull request #557 from willmiao/civarc-api-support
CivArchive API support
2025-10-11 18:10:06 +08:00
pixelpaws
87842385c6 Merge pull request #558 from willmiao/codex/design-custom-priority-tags-format
feat: add customizable priority tags
2025-10-11 17:46:13 +08:00
Will Miao
1dc189eb39 feat(metadata): implement fallback provider strategy for deleted models
Refactor metadata sync service to use a prioritized provider fallback system when handling deleted CivitAI models. The new approach:

1. Attempts civarchive_api provider first for deleted models
2. Falls back to sqlite provider if archive DB is enabled
3. Maintains existing default provider behavior for non-deleted models
4. Tracks provider attempts and errors for better debugging

This improves reliability when fetching metadata for deleted models by trying multiple sources before giving up, and provides clearer error messages based on which providers were attempted.
2025-10-11 17:44:38 +08:00
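The prioritized fallback in steps 1-4 above can be sketched like this. The provider names come from the commit message; the function shape, provider interface, and error handling are assumptions for illustration only.

```python
async def fetch_metadata_with_fallback(model_hash, providers, archive_db_enabled=True):
    """Try providers in priority order for a deleted model, tracking failures.

    `providers` maps provider names to objects exposing an async
    `get_model_by_hash` method (a hypothetical interface).
    """
    attempts = []  # (provider_name, error_message) pairs for debugging
    order = ["civarchive_api"]
    if archive_db_enabled:
        order.append("sqlite")
    for name in order:
        provider = providers.get(name)
        if provider is None:
            continue
        try:
            metadata = await provider.get_model_by_hash(model_hash)
            if metadata:
                return metadata, attempts
        except Exception as exc:  # record the failure and keep falling back
            attempts.append((name, str(exc)))
    raise LookupError(f"all providers failed: {attempts}")
```

Returning the `attempts` list alongside the result is one way to get the "clearer error messages based on which providers were attempted" that the commit describes.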
pixelpaws
6120922204 chore(priority-tags): add newline terminator 2025-10-11 17:38:20 +08:00
Will Miao
ddb30dbb17 Revert "feat(civarchive_client): update get_model_version_info to resolve the real model/version IDs before fetching the target metadata."
This reverts commit c3a66ecf28.
2025-10-11 16:11:17 +08:00
Will Miao
1e8bd88e28 feat(metadata): improve model ID redirect logic and provider ordering
- Fix CivArchive model ID redirect logic to only follow redirects when context points to original model
- Rename CivitaiModelMetadataProvider to CivArchiveModelMetadataProvider for consistency
- Reorder fallback metadata providers to prioritize Civitai API over CivArchive API for better metadata quality
- Remove unused asyncio import and redundant logging from metadata sync service
2025-10-11 16:11:13 +08:00
Will Miao
c3a66ecf28 feat(civarchive_client): update get_model_version_info to resolve the real model/version IDs before fetching the target metadata. 2025-10-11 15:07:42 +08:00
Will Miao
1f60160e8b feat(civarchive_client): enhance request handling and context parsing
Introduce `_request_json` method for async JSON requests and improved error handling. Add static methods `_normalize_payload`, `_split_context`, `_ensure_list`, and `_build_model_info` to parse and normalize API responses. These changes improve the robustness of the CivArchiveClient by ensuring consistent data structures and handling potential API response issues gracefully.
2025-10-11 13:07:29 +08:00
Will Miao
7d560bf07a chore: add refs 2025-10-11 12:59:13 +08:00
Will Miao
47da9949d9 feat: update recipe download URL path to include /lm prefix
Update the download URL path in RecipeSharingService to include '/lm' prefix,
aligning with the new API route structure for recipe sharing endpoints.
2025-10-10 21:46:56 +08:00
scruffynerf
68c0a5ba71 Better Civ Archive support (adds API) (#549)
* add CivArchive API

* Oops, missed committing this part when I updated codebase to latest version

* Adjust API for version fetching and solve the broken API (hash gives only files, not models - likely to be fixed but in the meantime...)

* add asyncio import to allow timeout cooldown

---------

Co-authored-by: Scruffy Nerf <Scruffynerf@duck.com>
2025-10-10 20:04:01 +08:00
pixelpaws
1aa81c803b Merge pull request #551 from willmiao/codex/refactor-model-metadata-saving-logic
fix: hydrate cached metadata before persisting updates
2025-10-10 08:56:12 +08:00
pixelpaws
8f5e134d3e fix: skip redundant hydration in metadata sync service 2025-10-10 08:49:54 +08:00
pixelpaws
ef03a2a917 fix(metadata): hydrate cached records before saving 2025-10-10 08:30:51 +08:00
Will Miao
e275968553 feat(civitai): remove debug print statement from rewrite_preview_url function 2025-10-10 08:18:23 +08:00
116 changed files with 7791 additions and 946 deletions

IFLOW.md Normal file

@@ -0,0 +1,103 @@
# ComfyUI LoRA Manager - iFlow Context
## Project Overview
ComfyUI LoRA Manager is a comprehensive toolset that simplifies organizing, downloading, and applying LoRA models in ComfyUI. It provides powerful features such as recipe management, checkpoint organization, and one-click workflow integration, making model operations faster, smoother, and simpler.
The project is a web application combining a Python backend with a JavaScript frontend; it can run either as a ComfyUI custom node or as a standalone application.
## Project Structure
```
D:\Workspace\ComfyUI\custom_nodes\ComfyUI-Lora-Manager\
├── py/                      # Python backend code
│   ├── config.py            # Global configuration
│   ├── lora_manager.py      # Main entry point
│   ├── controllers/         # Controllers
│   ├── metadata_collector/  # Metadata collector
│   ├── middleware/          # Middleware
│   ├── nodes/               # ComfyUI nodes
│   ├── recipes/             # Recipe-related code
│   ├── routes/              # API routes
│   ├── services/            # Business-logic services
│   ├── utils/               # Utility functions
│   └── validators/          # Validators
├── static/                  # Static assets (CSS, JS, images)
├── templates/               # HTML templates
├── locales/                 # Internationalization files
├── tests/                   # Test code
├── standalone.py            # Standalone-mode entry point
├── requirements.txt         # Python dependencies
├── package.json             # Node.js dependencies and scripts
└── README.md                # Project README
```
## Core Components
### Backend (Python)
- **Main entry points**: `py/lora_manager.py` and `standalone.py`
- **Configuration**: `py/config.py` manages global configuration and paths
- **Routes**: the `py/routes/` directory contains the API routes
- **Services**: the `py/services/` directory contains business logic such as model scanning and download management
- **Model management**: `ModelServiceFactory` manages the different model types (LoRA, Checkpoint, Embedding)
### Frontend (JavaScript)
- **Build tooling**: Node.js and npm for dependency management and testing
- **Testing**: Vitest for frontend tests
## Build and Run
### Install Dependencies
```bash
# Python dependencies
pip install -r requirements.txt
# Node.js dependencies (for testing)
npm install
```
### Run (ComfyUI Mode)
After installing it as a ComfyUI custom node, simply start ComfyUI.
### Run (Standalone Mode)
```bash
# Run with the default configuration
python standalone.py
# Specify host and port
python standalone.py --host 127.0.0.1 --port 9000
```
### Testing
#### Backend Tests
```bash
# Install development dependencies
pip install -r requirements-dev.txt
# Run the tests
pytest
```
#### Frontend Tests
```bash
# Run the tests
npm run test
# Run the tests and generate a coverage report
npm run test:coverage
```
## Development Conventions
- **Code style**: Python code should follow PEP 8
- **Testing**: New features should include corresponding unit tests
- **Configuration**: User configuration lives in `settings.json`
- **Logging**: Use the Python standard library `logging` module


@@ -34,15 +34,16 @@ Enhance your Civitai browsing experience with our companion browser extension! S
 ## Release Notes

-### v0.9.6
-* **Critical Performance Optimization** - Introduced persistent model cache that dramatically accelerates initialization after startup and significantly reduces Python backend memory footprint for improved application performance.
-* **Cross-Browser Settings Synchronization** - Migrated nearly all settings to the backend, ensuring your preferences sync automatically across all browsers for a seamless multi-browser experience.
-* **Protected User Settings Location** - Relocated user settings (settings.json) to the user config directory (accessible via the link icon in Settings), preventing accidental deletion during reinstalls or updates.
-* **Global Context Menu** - Added a new global context menu accessible by right-clicking on empty page areas, providing quick access to global operations with more features coming in future updates.
-* **Multi-Library Support** - Introduced support for managing multiple libraries, allowing you to easily switch between different model collections (advanced usage, documentation in progress).
-* **Bug Fixes & Stability Improvements** - Various bug fixes and enhancements for improved stability and reliability.
+### v0.9.8
+* **Full CivArchive API Support** - Added complete support for the CivArchive API as a fallback metadata source beyond Civitai API. Models deleted from Civitai can now still retrieve metadata through the CivArchive API.
+* **Download Models from CivArchive** - Added support for downloading models directly from CivArchive, similar to downloading from Civitai. Simply click the Download button and paste the model URL to download the corresponding model.
+* **Custom Priority Tags** - Introduced Custom Priority Tags feature, allowing users to define custom priority tags. These tags will appear as suggestions when editing tags or during auto organization/download using default paths, providing more precise and controlled folder organization. [Guide](https://github.com/willmiao/ComfyUI-Lora-Manager/wiki/Priority-Tags-Configuration-Guide)
+* **Drag and Drop Tag Reordering** - Added drag and drop functionality to reorder tags in the tags edit mode for improved usability.
+* **Download Control in Example Images Panel** - Added stop control in the Download Example Images Panel for better download management.
+* **Prompt (LoraManager) Node with Autocomplete** - Added new Prompt (LoraManager) node with autocomplete feature for adding embeddings.
+* **Lora Manager Nodes in Subgraphs** - Lora Manager nodes now support being placed within subgraphs for more flexible workflow organization.

-### v0.9.3
+### v0.9.6
 * **Metadata Archive Database Support** - Added the ability to download and utilize a metadata archive database, enabling access to metadata for models that have been deleted from CivitAI.
 * **App-Level Proxy Settings** - Introduced support for configuring a global proxy within the application, making it easier to use the manager behind network restrictions.
 * **Bug Fixes** - Various bug fixes for improved stability and reliability.


@@ -2,6 +2,7 @@ try:  # pragma: no cover - import fallback for pytest collection
     from .py.lora_manager import LoraManager
     from .py.nodes.lora_loader import LoraManagerLoader, LoraManagerTextLoader
     from .py.nodes.trigger_word_toggle import TriggerWordToggle
+    from .py.nodes.prompt import PromptLoraManager
     from .py.nodes.lora_stacker import LoraStacker
     from .py.nodes.save_image import SaveImage
     from .py.nodes.debug_metadata import DebugMetadata
@@ -17,6 +18,7 @@ except ImportError:  # pragma: no cover - allows running under pytest without pa
     if str(package_root) not in sys.path:
         sys.path.append(str(package_root))
+    PromptLoraManager = importlib.import_module("py.nodes.prompt").PromptLoraManager
     LoraManager = importlib.import_module("py.lora_manager").LoraManager
     LoraManagerLoader = importlib.import_module("py.nodes.lora_loader").LoraManagerLoader
     LoraManagerTextLoader = importlib.import_module("py.nodes.lora_loader").LoraManagerTextLoader
@@ -29,6 +31,7 @@ except ImportError:  # pragma: no cover - allows running under pytest without pa
     init_metadata_collector = importlib.import_module("py.metadata_collector").init

 NODE_CLASS_MAPPINGS = {
+    PromptLoraManager.NAME: PromptLoraManager,
     LoraManagerLoader.NAME: LoraManagerLoader,
     LoraManagerTextLoader.NAME: LoraManagerTextLoader,
     TriggerWordToggle.NAME: TriggerWordToggle,


@@ -0,0 +1,46 @@
# Custom Priority Tag Format Proposal
To support user-defined priority tags with flexible aliasing across different model types, the configuration will be stored as editable strings. The format balances readability with enough structure for parsing on both the backend and frontend.
## Format Overview
- Each model type is declared on its own line: `model_type: entries`.
- Entries are comma-separated and ordered by priority from highest to lowest.
- An entry may be a single canonical tag (e.g., `realistic`) or a canonical tag with aliases.
- Canonical tags define the final folder name that should be used when matching that entry.
- Aliases are enclosed in parentheses and separated by `|` (vertical bar).
- All matching is case-insensitive; stored canonical names preserve the user-specified casing for folder creation and UI suggestions.
### Grammar
```
priority-config := model-config { "\n" model-config }
model-config := model-type ":" entry-list
model-type := <identifier without spaces>
entry-list := entry { "," entry }
entry := canonical [ "(" alias { "|" alias } ")" ]
canonical := <tag text without parentheses or commas>
alias := <tag text without parentheses, commas, or pipes>
```
Examples:
```
lora: celebrity(celeb|celebrity), stylized, character(char)
checkpoint: realistic(realism|realistic), anime(anime-style|toon)
embedding: face, celeb(celebrity|celeb)
```
## Parsing Notes
- Whitespace around separators is ignored to make manual editing more forgiving.
- Duplicate canonical tags within the same model type collapse to a single entry; the first definition wins.
- Aliases map to their canonical tag. When generating folder names, the canonical form is used.
- Tags that do not match any alias or canonical entry fall back to the first tag in the model's tag list, preserving current behavior.
## Usage
- **Backend:** Convert each model type's string into an ordered list of canonical tags with alias sets. During path generation, iterate by priority order and match tags against both canonical names and their aliases.
- **Frontend:** Surface canonical tags as suggestions, optionally displaying aliases in tooltips or secondary text. Input validation should warn about duplicate aliases within the same model type.
This format allows users to customize priority tag handling per model type while keeping editing simple and avoiding proliferation of folder names through alias normalization.
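The grammar and parsing notes above can be captured in a short parser. This is a minimal sketch, not the shipped implementation; function and variable names are illustrative.

```python
import re

# One entry: a canonical tag, optionally followed by (alias|alias|...)
ENTRY_RE = re.compile(r"^(?P<canonical>[^(,]+?)\s*(?:\((?P<aliases>[^)]*)\))?$")


def parse_priority_config(text):
    """Map each model type to an ordered list of (canonical, alias_set) pairs.

    Matching data is lowercased (case-insensitive matching), while the
    canonical string keeps the user's casing for folder names and UI.
    Duplicate canonical tags collapse; the first definition wins.
    """
    config = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or ":" not in line:
            continue
        model_type, _, entries = line.partition(":")
        parsed, seen = [], set()
        for raw in entries.split(","):
            match = ENTRY_RE.match(raw.strip())
            if not match:
                continue  # malformed entry; a real parser might warn here
            canonical = match.group("canonical").strip()
            if canonical.lower() in seen:
                continue
            seen.add(canonical.lower())
            aliases = {
                a.strip().lower()
                for a in (match.group("aliases") or "").split("|")
                if a.strip()
            }
            parsed.append((canonical, aliases))
        config[model_type.strip()] = parsed
    return config
```

For example, `parse_priority_config("lora: celebrity(celeb|celebrity), stylized, character(char)")` yields the ordered entries `celebrity`, `stylized`, `character` with their alias sets, ready for the priority-ordered matching described under Usage.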


@@ -0,0 +1,71 @@
# Priority Tags Configuration Guide
This guide explains how to tailor the tag priority order that powers folder naming and tag suggestions in the LoRA Manager. You only need to edit the comma-separated list of entries shown in the **Priority Tags** field for each model type.
## 1. Pick the Model Type
In the **Priority Tags** dialog you will find one tab per model type (LoRA, Checkpoint, Embedding). Select the tab you want to update; changes on one tab do not affect the others.
## 2. Edit the Entry List
Inside the textarea you will see a line similar to:
```
character, concept, style(toon|toon_style)
```
This entire line is the **entry list**. Replace it with your own ordered list.
### Entry Rules
Each entry is separated by a comma, in order from highest to lowest priority:
- **Canonical tag only:** `realistic`
- **Canonical tag with aliases:** `character(char|chars)`
Aliases live inside `()` and are separated with `|`. The canonical name is what appears in folder names and UI suggestions when any of the aliases are detected. Matching is case-insensitive.
## Use `{first_tag}` in Path Templates
When your path template contains `{first_tag}`, the app picks a folder name based on your priority list and the model's own tags:
- It checks the priority list from top to bottom. If a canonical tag or any of its aliases appear in the model tags, that canonical name becomes the folder name.
- If no priority tags are found but the model has tags, the very first model tag is used.
- If the model has no tags at all, the folder falls back to `no tags`.
### Example
With a template like `/{model_type}/{first_tag}` and the priority entry list `character(char|chars), style(anime|toon)`:
| Model Tags | Folder Name | Why |
| --- | --- | --- |
| `["chars", "female"]` | `character` | `chars` matches the `character` alias, so the canonical wins. |
| `["anime", "portrait"]` | `style` | `anime` hits the `style` entry, so its canonical label is used. |
| `["portrait", "bw"]` | `portrait` | No priority match, so the first model tag is used. |
| `[]` | `no tags` | Nothing to match, so the fallback is applied. |
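The lookup rules in the table above can be expressed as a short helper. This is a sketch assuming the priority list has already been parsed into `(canonical, alias_set)` pairs with lowercase aliases; the function name is hypothetical.

```python
def resolve_first_tag(model_tags, priority_entries):
    """Pick the {first_tag} folder name for a model.

    `priority_entries` is an ordered list of (canonical, alias_set) pairs,
    highest priority first; matching is case-insensitive.
    """
    lowered = {t.lower() for t in model_tags}
    for canonical, aliases in priority_entries:  # first match wins
        if canonical.lower() in lowered or lowered & aliases:
            return canonical
    if model_tags:
        return model_tags[0]  # no priority match: fall back to the first tag
    return "no tags"          # no tags at all: use the documented fallback
```

Running this with the entry list `character(char|chars), style(anime|toon)` reproduces each row of the example table.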
## 3. Save the Settings
After editing the entry list, press **Enter** to save. Use **Shift+Enter** whenever you need a new line. Clicking outside the field also saves automatically. A success toast confirms the update.
## Examples
| Goal | Entry List |
| --- | --- |
| Prefer people over styles | `character, portraits, style(anime\|toon)` |
| Group sci-fi variants | `sci-fi(scifi\|science_fiction), cyberpunk(cyber\|punk)` |
| Alias shorthand tags | `realistic(real\|realisim), photorealistic(photo_real)` |
## Tips
- Keep canonical names short and meaningful—they become folder names.
- Place the most important categories first; the first match wins.
- Avoid duplicate canonical names within the same list; only the first instance is used.
## Troubleshooting
- **Unexpected folder name?** Check that the canonical name you want is placed before other matches.
- **Alias not working?** Ensure the alias is inside parentheses and separated with `|`, e.g. `character(char|chars)`.
- **Validation error?** Look for missing parentheses or stray commas. Each entry must follow the `canonical(alias|alias)` pattern or just `canonical`.
With these basics you can quickly adapt Priority Tags to match your library's organization style.


@@ -0,0 +1,26 @@
# Backend Test Coverage Notes
## Pytest Execution
- Command: `python -m pytest`
- Result: All 283 collected tests passed in the current environment.
- Coverage tooling (`pytest-cov`/`coverage`) is unavailable in the offline sandbox, so line-level metrics could not be generated. The earlier attempt to install `pytest-cov` failed because the package index cannot be reached from the container.
## High-Priority Gaps to Address
### 1. Standalone server bootstrapping
* **Source:** [`standalone.py`](../../standalone.py)
* **Why it matters:** The standalone entry point wires together the aiohttp application, static asset routes, model-route registration, and configuration validation. None of these behaviours are covered by automated tests, leaving regressions in bootstrapping logic undetected.
* **Suggested coverage:** Add integration-style tests that instantiate `StandaloneServer`/`StandaloneLoraManager` with temporary settings and assert that routes (HTTP + websocket) are registered, configuration warnings fire for missing paths, and the mock ComfyUI shims behave as expected.
### 2. Model service registration factory
* **Source:** [`py/services/model_service_factory.py`](../../py/services/model_service_factory.py)
* **Why it matters:** The factory coordinates which model services and routes the API exposes, including error handling when unknown model types are requested. No current tests verify registration, memoization of route instances, or the logging path on failures.
* **Suggested coverage:** Unit tests that exercise `register_model_type`, `get_route_instance`, error branches in `get_service_class`/`get_route_class`, and `setup_all_routes` when a route setup raises. Use lightweight fakes to confirm the logger is called and state is cleared via `clear_registrations`.
### 3. Server-side i18n helper
* **Source:** [`py/services/server_i18n.py`](../../py/services/server_i18n.py)
* **Why it matters:** Template rendering relies on the `ServerI18nManager` to load locale JSON, perform key lookups, and format parameters. The fallback logic (dot-notation lookup, English fallbacks, placeholder substitution) is untested, so malformed locale files or regressions in placeholder handling would slip through.
* **Suggested coverage:** Tests that load fixture locale dictionaries, assert `set_locale` fallbacks, verify nested key resolution and placeholder substitution, and ensure missing keys return the original identifier.
## Next Steps
Prioritize creating focused unit tests around these modules, then re-run pytest once coverage tooling is available to confirm the new tests close the identified gaps.


@@ -32,7 +32,7 @@
     "korean": "한국어",
     "french": "Français",
     "spanish": "Español",
     "Hebrew": "עברית"
   },
   "fileSize": {
     "zero": "0 Bytes",
@@ -199,6 +199,7 @@
     "videoSettings": "Video-Einstellungen",
     "layoutSettings": "Layout-Einstellungen",
     "folderSettings": "Ordner-Einstellungen",
+    "priorityTags": "Prioritäts-Tags",
     "downloadPathTemplates": "Download-Pfad-Vorlagen",
     "exampleImages": "Beispielbilder",
     "misc": "Verschiedenes",
@@ -224,9 +225,9 @@
     },
     "displayDensityHelp": "Wählen Sie, wie viele Karten pro Zeile angezeigt werden sollen:",
     "displayDensityDetails": {
-      "default": "Standard: 5 (1080p), 6 (2K), 8 (4K)",
-      "medium": "Mittel: 6 (1080p), 7 (2K), 9 (4K)",
-      "compact": "Kompakt: 7 (1080p), 8 (2K), 10 (4K)"
+      "default": "5 (1080p), 6 (2K), 8 (4K)",
+      "medium": "6 (1080p), 7 (2K), 9 (4K)",
+      "compact": "7 (1080p), 8 (2K), 10 (4K)"
     },
     "displayDensityWarning": "Warnung: Höhere Dichten können bei Systemen mit begrenzten Ressourcen zu Performance-Problemen führen.",
     "cardInfoDisplay": "Karten-Info-Anzeige",
@@ -236,8 +237,18 @@
     },
     "cardInfoDisplayHelp": "Wählen Sie, wann Modellinformationen und Aktionsschaltflächen angezeigt werden sollen:",
     "cardInfoDisplayDetails": {
-      "always": "Immer sichtbar: Kopf- und Fußzeilen sind immer sichtbar",
-      "hover": "Bei Hover anzeigen: Kopf- und Fußzeilen erscheinen nur beim Darüberfahren mit der Maus"
+      "always": "Kopf- und Fußzeilen sind immer sichtbar",
+      "hover": "Kopf- und Fußzeilen erscheinen nur beim Darüberfahren mit der Maus"
+    },
+    "modelNameDisplay": "Anzeige des Modellnamens",
+    "modelNameDisplayOptions": {
+      "modelName": "Modellname",
+      "fileName": "Dateiname"
+    },
+    "modelNameDisplayHelp": "Wählen Sie aus, was in der Fußzeile der Modellkarte angezeigt werden soll:",
+    "modelNameDisplayDetails": {
+      "modelName": "Den beschreibenden Namen des Modells anzeigen",
+      "fileName": "Den tatsächlichen Dateinamen auf der Festplatte anzeigen"
     }
   },
   "folderSettings": {
@@ -253,6 +264,26 @@
     "defaultEmbeddingRootHelp": "Legen Sie den Standard-Embedding-Stammordner für Downloads, Importe und Verschiebungen fest",
     "noDefault": "Kein Standard"
   },
+  "priorityTags": {
+    "title": "Prioritäts-Tags",
+    "description": "Passen Sie die Tag-Prioritätsreihenfolge für jeden Modelltyp an (z. B. character, concept, style(toon|toon_style))",
+    "placeholder": "character, concept, style(toon|toon_style)",
+    "helpLinkLabel": "Prioritäts-Tags-Hilfe öffnen",
+    "modelTypes": {
+      "lora": "LoRA",
+      "checkpoint": "Checkpoint",
+      "embedding": "Embedding"
+    },
+    "saveSuccess": "Prioritäts-Tags aktualisiert.",
+    "saveError": "Prioritäts-Tags konnten nicht aktualisiert werden.",
+    "loadingSuggestions": "Lade Vorschläge...",
+    "validation": {
+      "missingClosingParen": "Eintrag {index} fehlt eine schließende Klammer.",
+      "missingCanonical": "Eintrag {index} muss einen kanonischen Tag-Namen enthalten.",
+      "duplicateCanonical": "Der kanonische Tag \"{tag}\" kommt mehrfach vor.",
+      "unknown": "Ungültige Prioritäts-Tag-Konfiguration."
+    }
+  },
   "downloadPathTemplates": {
     "title": "Download-Pfad-Vorlagen",
     "help": "Konfigurieren Sie Ordnerstrukturen für verschiedene Modelltypen beim Herunterladen von Civitai.",
@@ -538,7 +569,10 @@
     "recursiveOn": "Unterordner durchsuchen",
     "recursiveOff": "Nur aktuellen Ordner durchsuchen",
     "recursiveUnavailable": "Rekursive Suche ist nur in der Baumansicht verfügbar",
-    "collapseAllDisabled": "Im Listenmodus nicht verfügbar"
+    "collapseAllDisabled": "Im Listenmodus nicht verfügbar",
+    "dragDrop": {
+      "unableToResolveRoot": "Zielpfad für das Verschieben konnte nicht ermittelt werden."
+    }
   },
   "statistics": {
     "title": "Statistiken",
@@ -613,6 +647,14 @@
     "downloadedPreview": "Vorschaubild heruntergeladen",
     "downloadingFile": "{type}-Datei wird heruntergeladen",
     "finalizing": "Download wird abgeschlossen..."
+  },
+  "progress": {
+    "currentFile": "Aktuelle Datei:",
+    "downloading": "Wird heruntergeladen: {name}",
+    "transferred": "Heruntergeladen: {downloaded} / {total}",
+    "transferredSimple": "Heruntergeladen: {downloaded}",
+    "transferredUnknown": "Heruntergeladen: --",
+    "speed": "Geschwindigkeit: {speed}"
   }
 },
 "move": {
@@ -1213,6 +1255,8 @@
     "pauseFailed": "Fehler beim Pausieren des Downloads: {error}",
     "downloadResumed": "Download fortgesetzt",
     "resumeFailed": "Fehler beim Fortsetzen des Downloads: {error}",
+    "downloadStopped": "Download abgebrochen",
+    "stopFailed": "Download konnte nicht abgebrochen werden: {error}",
     "deleted": "Beispielbild gelöscht",
     "deleteFailed": "Fehler beim Löschen des Beispielbilds",
     "setPreviewFailed": "Fehler beim Setzen des Vorschaubilds"

View File

@@ -199,6 +199,7 @@
     "videoSettings": "Video Settings",
     "layoutSettings": "Layout Settings",
     "folderSettings": "Folder Settings",
+    "priorityTags": "Priority Tags",
     "downloadPathTemplates": "Download Path Templates",
     "exampleImages": "Example Images",
     "misc": "Misc.",
@@ -224,9 +225,9 @@
     },
     "displayDensityHelp": "Choose how many cards to display per row:",
     "displayDensityDetails": {
-      "default": "Default: 5 (1080p), 6 (2K), 8 (4K)",
-      "medium": "Medium: 6 (1080p), 7 (2K), 9 (4K)",
-      "compact": "Compact: 7 (1080p), 8 (2K), 10 (4K)"
+      "default": "5 (1080p), 6 (2K), 8 (4K)",
+      "medium": "6 (1080p), 7 (2K), 9 (4K)",
+      "compact": "7 (1080p), 8 (2K), 10 (4K)"
     },
     "displayDensityWarning": "Warning: Higher densities may cause performance issues on systems with limited resources.",
     "cardInfoDisplay": "Card Info Display",
@@ -236,8 +237,18 @@
     },
     "cardInfoDisplayHelp": "Choose when to display model information and action buttons:",
     "cardInfoDisplayDetails": {
-      "always": "Always Visible: Headers and footers are always visible",
-      "hover": "Reveal on Hover: Headers and footers only appear when hovering over a card"
+      "always": "Headers and footers are always visible",
+      "hover": "Headers and footers only appear when hovering over a card"
+    },
+    "modelNameDisplay": "Model Name Display",
+    "modelNameDisplayOptions": {
+      "modelName": "Model Name",
+      "fileName": "File Name"
+    },
+    "modelNameDisplayHelp": "Choose what to display in the model card footer:",
+    "modelNameDisplayDetails": {
+      "modelName": "Display the model's descriptive name",
+      "fileName": "Display the actual file name on disk"
     }
   },
   "folderSettings": {
@@ -253,6 +264,26 @@
     "defaultEmbeddingRootHelp": "Set the default embedding root directory for downloads, imports and moves",
     "noDefault": "No Default"
   },
+  "priorityTags": {
+    "title": "Priority Tags",
+    "description": "Customize the tag priority order for each model type (e.g., character, concept, style(toon|toon_style))",
+    "placeholder": "character, concept, style(toon|toon_style)",
+    "helpLinkLabel": "Open priority tags help",
+    "modelTypes": {
+      "lora": "LoRA",
+      "checkpoint": "Checkpoint",
+      "embedding": "Embedding"
+    },
+    "saveSuccess": "Priority tags updated.",
+    "saveError": "Failed to update priority tags.",
+    "loadingSuggestions": "Loading suggestions...",
+    "validation": {
+      "missingClosingParen": "Entry {index} is missing a closing parenthesis.",
+      "missingCanonical": "Entry {index} must include a canonical tag name.",
+      "duplicateCanonical": "The canonical tag \"{tag}\" appears more than once.",
+      "unknown": "Invalid priority tag configuration."
+    }
+  },
   "downloadPathTemplates": {
     "title": "Download Path Templates",
     "help": "Configure folder structures for different model types when downloading from Civitai.",
@@ -391,14 +422,14 @@
     "selected": "{count} selected",
     "selectedSuffix": "selected",
     "viewSelected": "View Selected",
-    "addTags": "Add Tags to All",
-    "setBaseModel": "Set Base Model for All",
-    "setContentRating": "Set Content Rating for All",
-    "copyAll": "Copy All Syntax",
-    "refreshAll": "Refresh All Metadata",
-    "moveAll": "Move All to Folder",
+    "addTags": "Add Tags to Selected",
+    "setBaseModel": "Set Base Model for Selected",
+    "setContentRating": "Set Content Rating for Selected",
+    "copyAll": "Copy Selected Syntax",
+    "refreshAll": "Refresh Selected Metadata",
+    "moveAll": "Move Selected to Folder",
     "autoOrganize": "Auto-Organize Selected",
-    "deleteAll": "Delete All Models",
+    "deleteAll": "Delete Selected Models",
     "clear": "Clear Selection",
     "autoOrganizeProgress": {
       "initializing": "Initializing auto-organize...",
@@ -538,7 +569,10 @@
     "recursiveOn": "Search subfolders",
     "recursiveOff": "Search current folder only",
     "recursiveUnavailable": "Recursive search is available in tree view only",
-    "collapseAllDisabled": "Not available in list view"
+    "collapseAllDisabled": "Not available in list view",
+    "dragDrop": {
+      "unableToResolveRoot": "Unable to determine destination path for move."
+    }
   },
   "statistics": {
     "title": "Statistics",
@@ -613,6 +647,14 @@
     "downloadedPreview": "Downloaded preview image",
     "downloadingFile": "Downloading {type} file",
     "finalizing": "Finalizing download..."
+  },
+  "progress": {
+    "currentFile": "Current file:",
+    "downloading": "Downloading: {name}",
+    "transferred": "Transferred: {downloaded} / {total}",
+    "transferredSimple": "Transferred: {downloaded}",
+    "transferredUnknown": "Transferred: --",
+    "speed": "Speed: {speed}"
   }
 },
 "move": {
@@ -1213,6 +1255,8 @@
     "pauseFailed": "Failed to pause download: {error}",
     "downloadResumed": "Download resumed",
     "resumeFailed": "Failed to resume download: {error}",
+    "downloadStopped": "Download cancelled",
+    "stopFailed": "Failed to cancel download: {error}",
     "deleted": "Example image deleted",
     "deleteFailed": "Failed to delete example image",
     "setPreviewFailed": "Failed to set preview image"

View File

@@ -32,7 +32,7 @@
     "korean": "한국어",
     "french": "Français",
     "spanish": "Español",
     "Hebrew": "עברית"
   },
   "fileSize": {
     "zero": "0 Bytes",
@@ -199,6 +199,7 @@
     "videoSettings": "Configuración de video",
     "layoutSettings": "Configuración de diseño",
     "folderSettings": "Configuración de carpetas",
+    "priorityTags": "Etiquetas prioritarias",
     "downloadPathTemplates": "Plantillas de rutas de descarga",
     "exampleImages": "Imágenes de ejemplo",
     "misc": "Varios",
@@ -224,9 +225,9 @@
     },
     "displayDensityHelp": "Elige cuántas tarjetas mostrar por fila:",
     "displayDensityDetails": {
-      "default": "Predeterminado: 5 (1080p), 6 (2K), 8 (4K)",
-      "medium": "Medio: 6 (1080p), 7 (2K), 9 (4K)",
-      "compact": "Compacto: 7 (1080p), 8 (2K), 10 (4K)"
+      "default": "5 (1080p), 6 (2K), 8 (4K)",
+      "medium": "6 (1080p), 7 (2K), 9 (4K)",
+      "compact": "7 (1080p), 8 (2K), 10 (4K)"
     },
     "displayDensityWarning": "Advertencia: Densidades más altas pueden causar problemas de rendimiento en sistemas con recursos limitados.",
     "cardInfoDisplay": "Visualización de información de tarjeta",
@@ -236,8 +237,18 @@
     },
     "cardInfoDisplayHelp": "Elige cuándo mostrar información del modelo y botones de acción:",
     "cardInfoDisplayDetails": {
-      "always": "Siempre visible: Los encabezados y pies de página siempre son visibles",
-      "hover": "Mostrar al pasar el ratón: Los encabezados y pies de página solo aparecen al pasar el ratón sobre una tarjeta"
+      "always": "Los encabezados y pies de página siempre son visibles",
+      "hover": "Los encabezados y pies de página solo aparecen al pasar el ratón sobre una tarjeta"
+    },
+    "modelNameDisplay": "Visualización del nombre del modelo",
+    "modelNameDisplayOptions": {
+      "modelName": "Nombre del modelo",
+      "fileName": "Nombre del archivo"
+    },
+    "modelNameDisplayHelp": "Elige qué mostrar en el pie de la tarjeta del modelo:",
+    "modelNameDisplayDetails": {
+      "modelName": "Mostrar el nombre descriptivo del modelo",
+      "fileName": "Mostrar el nombre real del archivo en el disco"
     }
   },
   "folderSettings": {
@@ -253,6 +264,26 @@
     "defaultEmbeddingRootHelp": "Establecer el directorio raíz predeterminado de embedding para descargas, importaciones y movimientos",
     "noDefault": "Sin predeterminado"
   },
+  "priorityTags": {
+    "title": "Etiquetas prioritarias",
+    "description": "Personaliza el orden de prioridad de etiquetas para cada tipo de modelo (p. ej., character, concept, style(toon|toon_style))",
+    "placeholder": "character, concept, style(toon|toon_style)",
+    "helpLinkLabel": "Abrir ayuda de etiquetas prioritarias",
+    "modelTypes": {
+      "lora": "LoRA",
+      "checkpoint": "Checkpoint",
+      "embedding": "Embedding"
+    },
+    "saveSuccess": "Etiquetas prioritarias actualizadas.",
+    "saveError": "Error al actualizar las etiquetas prioritarias.",
+    "loadingSuggestions": "Cargando sugerencias...",
+    "validation": {
+      "missingClosingParen": "A la entrada {index} le falta un paréntesis de cierre.",
+      "missingCanonical": "La entrada {index} debe incluir un nombre de etiqueta canónica.",
+      "duplicateCanonical": "La etiqueta canónica \"{tag}\" aparece más de una vez.",
+      "unknown": "Configuración de etiquetas prioritarias no válida."
+    }
+  },
   "downloadPathTemplates": {
     "title": "Plantillas de rutas de descarga",
     "help": "Configurar estructuras de carpetas para diferentes tipos de modelos al descargar de Civitai.",
@@ -538,7 +569,10 @@
     "recursiveOn": "Buscar en subcarpetas",
     "recursiveOff": "Buscar solo en la carpeta actual",
     "recursiveUnavailable": "La búsqueda recursiva solo está disponible en la vista en árbol",
-    "collapseAllDisabled": "No disponible en vista de lista"
+    "collapseAllDisabled": "No disponible en vista de lista",
+    "dragDrop": {
+      "unableToResolveRoot": "No se puede determinar la ruta de destino para el movimiento."
+    }
   },
   "statistics": {
     "title": "Estadísticas",
@@ -613,6 +647,14 @@
     "downloadedPreview": "Imagen de vista previa descargada",
     "downloadingFile": "Descargando archivo de {type}",
     "finalizing": "Finalizando descarga..."
+  },
+  "progress": {
+    "currentFile": "Archivo actual:",
+    "downloading": "Descargando: {name}",
+    "transferred": "Descargado: {downloaded} / {total}",
+    "transferredSimple": "Descargado: {downloaded}",
+    "transferredUnknown": "Descargado: --",
+    "speed": "Velocidad: {speed}"
   }
 },
 "move": {
@@ -1213,6 +1255,8 @@
     "pauseFailed": "Error al pausar descarga: {error}",
     "downloadResumed": "Descarga reanudada",
     "resumeFailed": "Error al reanudar descarga: {error}",
+    "downloadStopped": "Descarga cancelada",
+    "stopFailed": "Error al cancelar descarga: {error}",
     "deleted": "Imagen de ejemplo eliminada",
     "deleteFailed": "Error al eliminar imagen de ejemplo",
     "setPreviewFailed": "Error al establecer imagen de vista previa"

View File

@@ -32,7 +32,7 @@
     "korean": "한국어",
     "french": "Français",
     "spanish": "Español",
     "Hebrew": "עברית"
   },
   "fileSize": {
     "zero": "0 Octets",
@@ -203,7 +203,8 @@
     "exampleImages": "Images d'exemple",
     "misc": "Divers",
     "metadataArchive": "Base de données d'archive des métadonnées",
-    "proxySettings": "Paramètres du proxy"
+    "proxySettings": "Paramètres du proxy",
+    "priorityTags": "Étiquettes prioritaires"
   },
   "contentFiltering": {
     "blurNsfwContent": "Flouter le contenu NSFW",
@@ -224,9 +225,9 @@
     },
     "displayDensityHelp": "Choisissez combien de cartes afficher par ligne :",
     "displayDensityDetails": {
-      "default": "Par défaut : 5 (1080p), 6 (2K), 8 (4K)",
-      "medium": "Moyen : 6 (1080p), 7 (2K), 9 (4K)",
-      "compact": "Compact : 7 (1080p), 8 (2K), 10 (4K)"
+      "default": "5 (1080p), 6 (2K), 8 (4K)",
+      "medium": "6 (1080p), 7 (2K), 9 (4K)",
+      "compact": "7 (1080p), 8 (2K), 10 (4K)"
     },
     "displayDensityWarning": "Attention : Des densités plus élevées peuvent causer des problèmes de performance sur les systèmes avec des ressources limitées.",
     "cardInfoDisplay": "Affichage des informations de carte",
@@ -236,8 +237,18 @@
     },
     "cardInfoDisplayHelp": "Choisissez quand afficher les informations du modèle et les boutons d'action :",
     "cardInfoDisplayDetails": {
-      "always": "Toujours visible : Les en-têtes et pieds de page sont toujours visibles",
-      "hover": "Révéler au survol : Les en-têtes et pieds de page n'apparaissent qu'au survol d'une carte"
+      "always": "Les en-têtes et pieds de page sont toujours visibles",
+      "hover": "Les en-têtes et pieds de page n'apparaissent qu'au survol d'une carte"
+    },
+    "modelNameDisplay": "Affichage du nom du modèle",
+    "modelNameDisplayOptions": {
+      "modelName": "Nom du modèle",
+      "fileName": "Nom du fichier"
+    },
+    "modelNameDisplayHelp": "Choisissez ce qui doit être affiché dans le pied de page de la carte du modèle :",
+    "modelNameDisplayDetails": {
+      "modelName": "Afficher le nom descriptif du modèle",
+      "fileName": "Afficher le nom réel du fichier sur le disque"
     }
   },
   "folderSettings": {
@@ -345,6 +356,26 @@
     "proxyPassword": "Mot de passe (optionnel)",
     "proxyPasswordPlaceholder": "mot_de_passe",
     "proxyPasswordHelp": "Mot de passe pour l'authentification proxy (si nécessaire)"
+  },
+  "priorityTags": {
+    "title": "Étiquettes prioritaires",
+    "description": "Personnalisez l'ordre de priorité des étiquettes pour chaque type de modèle (par ex. : character, concept, style(toon|toon_style))",
+    "placeholder": "character, concept, style(toon|toon_style)",
+    "helpLinkLabel": "Ouvrir l'aide sur les étiquettes prioritaires",
+    "modelTypes": {
+      "lora": "LoRA",
+      "checkpoint": "Checkpoint",
+      "embedding": "Embedding"
+    },
+    "saveSuccess": "Étiquettes prioritaires mises à jour.",
+    "saveError": "Échec de la mise à jour des étiquettes prioritaires.",
+    "loadingSuggestions": "Chargement des suggestions...",
+    "validation": {
+      "missingClosingParen": "L'entrée {index} n'a pas de parenthèse fermante.",
+      "missingCanonical": "L'entrée {index} doit inclure un nom d'étiquette canonique.",
+      "duplicateCanonical": "L'étiquette canonique \"{tag}\" apparaît plusieurs fois.",
+      "unknown": "Configuration d'étiquettes prioritaires invalide."
+    }
   }
 },
 "loras": {
@@ -538,7 +569,10 @@
     "recursiveOn": "Rechercher dans les sous-dossiers",
     "recursiveOff": "Rechercher uniquement dans le dossier actuel",
     "recursiveUnavailable": "La recherche récursive n'est disponible qu'en vue arborescente",
-    "collapseAllDisabled": "Non disponible en vue liste"
+    "collapseAllDisabled": "Non disponible en vue liste",
+    "dragDrop": {
+      "unableToResolveRoot": "Impossible de déterminer le chemin de destination pour le déplacement."
+    }
   },
   "statistics": {
     "title": "Statistiques",
@@ -613,6 +647,14 @@
     "downloadedPreview": "Image d'aperçu téléchargée",
     "downloadingFile": "Téléchargement du fichier {type}",
     "finalizing": "Finalisation du téléchargement..."
+  },
+  "progress": {
+    "currentFile": "Fichier actuel :",
+    "downloading": "Téléchargement : {name}",
+    "transferred": "Téléchargé : {downloaded} / {total}",
+    "transferredSimple": "Téléchargé : {downloaded}",
+    "transferredUnknown": "Téléchargé : --",
+    "speed": "Vitesse : {speed}"
   }
 },
 "move": {
@@ -1213,6 +1255,8 @@
     "pauseFailed": "Échec de la mise en pause du téléchargement : {error}",
     "downloadResumed": "Téléchargement repris",
     "resumeFailed": "Échec de la reprise du téléchargement : {error}",
+    "downloadStopped": "Téléchargement annulé",
+    "stopFailed": "Échec de l'annulation du téléchargement : {error}",
     "deleted": "Image d'exemple supprimée",
     "deleteFailed": "Échec de la suppression de l'image d'exemple",
     "setPreviewFailed": "Échec de la définition de l'image d'aperçu"

View File

@@ -32,7 +32,7 @@
     "korean": "한국어",
     "french": "Français",
     "spanish": "Español",
     "Hebrew": "עברית"
   },
   "fileSize": {
     "zero": "0 בתים",
@@ -203,7 +203,8 @@
     "exampleImages": "תמונות דוגמה",
     "misc": "שונות",
     "metadataArchive": "מסד נתונים של ארכיון מטא-דאטה",
-    "proxySettings": "הגדרות פרוקסי"
+    "proxySettings": "הגדרות פרוקסי",
+    "priorityTags": "תגיות עדיפות"
   },
   "contentFiltering": {
     "blurNsfwContent": "טשטש תוכן NSFW",
@@ -224,9 +225,9 @@
     },
     "displayDensityHelp": "בחר כמה כרטיסים להציג בכל שורה:",
     "displayDensityDetails": {
-      "default": "ברירת מחדל: 5 (1080p), 6 (2K), 8 (4K)",
-      "medium": "בינוני: 6 (1080p), 7 (2K), 9 (4K)",
-      "compact": "קומפקטי: 7 (1080p), 8 (2K), 10 (4K)"
+      "default": "5 (1080p), 6 (2K), 8 (4K)",
+      "medium": "6 (1080p), 7 (2K), 9 (4K)",
+      "compact": "7 (1080p), 8 (2K), 10 (4K)"
     },
     "displayDensityWarning": "אזהרה: צפיפויות גבוהות יותר עלולות לגרום לבעיות ביצועים במערכות עם משאבים מוגבלים.",
     "cardInfoDisplay": "תצוגת מידע בכרטיס",
@@ -236,8 +237,18 @@
     },
     "cardInfoDisplayHelp": "בחר מתי להציג מידע על המודל וכפתורי פעולה:",
     "cardInfoDisplayDetails": {
-      "always": "תמיד גלוי: כותרות עליונות ותחתונות תמיד גלויות",
-      "hover": "חשוף בריחוף: כותרות עליונות ותחתונות מופיעות רק בעת ריחוף מעל כרטיס"
+      "always": "כותרות עליונות ותחתונות תמיד גלויות",
+      "hover": "כותרות עליונות ותחתונות מופיעות רק בעת ריחוף מעל כרטיס"
+    },
+    "modelNameDisplay": "תצוגת שם מודל",
+    "modelNameDisplayOptions": {
+      "modelName": "שם מודל",
+      "fileName": "שם קובץ"
+    },
+    "modelNameDisplayHelp": "בחר מה להציג בכותרת התחתונה של כרטיס המודל:",
+    "modelNameDisplayDetails": {
+      "modelName": "הצג את השם התיאורי של המודל",
+      "fileName": "הצג את שם הקובץ בפועל בדיסק"
    }
   },
   "folderSettings": {
@@ -345,6 +356,26 @@
     "proxyPassword": "סיסמה (אופציונלי)",
     "proxyPasswordPlaceholder": "password",
     "proxyPasswordHelp": "סיסמה לאימות מול הפרוקסי (אם נדרש)"
+  },
+  "priorityTags": {
+    "title": "תגיות עדיפות",
+    "description": "התאם את סדר העדיפות של התגיות עבור כל סוג מודל (לדוגמה: character, concept, style(toon|toon_style))",
+    "placeholder": "character, concept, style(toon|toon_style)",
+    "helpLinkLabel": "פתח עזרה בנושא תגיות עדיפות",
+    "modelTypes": {
+      "lora": "LoRA",
+      "checkpoint": "Checkpoint",
+      "embedding": "Embedding"
+    },
+    "saveSuccess": "תגיות העדיפות עודכנו.",
+    "saveError": "עדכון תגיות העדיפות נכשל.",
+    "loadingSuggestions": "טוען הצעות...",
+    "validation": {
+      "missingClosingParen": "לרשומה {index} חסר סוגר סוגריים.",
+      "missingCanonical": "על הרשומה {index} לכלול שם תגית קנונית.",
+      "duplicateCanonical": "התגית הקנונית \"{tag}\" מופיעה יותר מפעם אחת.",
+      "unknown": "תצורת תגיות העדיפות שגויה."
+    }
   }
 },
 "loras": {
@@ -538,7 +569,10 @@
     "recursiveOn": "חיפוש בתיקיות משנה",
     "recursiveOff": "חיפוש רק בתיקייה הנוכחית",
     "recursiveUnavailable": "חיפוש רקורסיבי זמין רק בתצוגת עץ",
-    "collapseAllDisabled": "לא זמין בתצוגת רשימה"
+    "collapseAllDisabled": "לא זמין בתצוגת רשימה",
+    "dragDrop": {
+      "unableToResolveRoot": "לא ניתן לקבוע את נתיב היעד להעברה."
+    }
   },
   "statistics": {
     "title": "סטטיסטיקה",
@@ -613,6 +647,14 @@
     "downloadedPreview": "תמונת תצוגה מקדימה הורדה",
     "downloadingFile": "מוריד קובץ {type}",
     "finalizing": "מסיים הורדה..."
+  },
+  "progress": {
+    "currentFile": "הקובץ הנוכחי:",
+    "downloading": "מוריד: {name}",
+    "transferred": "הורד: {downloaded} / {total}",
+    "transferredSimple": "הורד: {downloaded}",
+    "transferredUnknown": "הורד: --",
+    "speed": "מהירות: {speed}"
   }
 },
 "move": {
@@ -1213,6 +1255,8 @@
     "pauseFailed": "השהיית ההורדה נכשלה: {error}",
     "downloadResumed": "ההורדה חודשה",
     "resumeFailed": "חידוש ההורדה נכשל: {error}",
+    "downloadStopped": "ההורדה בוטלה",
+    "stopFailed": "נכשל בביטול ההורדה: {error}",
     "deleted": "תמונת הדוגמה נמחקה",
     "deleteFailed": "מחיקת תמונת הדוגמה נכשלה",
     "setPreviewFailed": "הגדרת תמונת התצוגה המקדימה נכשלה"

View File

@@ -32,7 +32,7 @@
"korean": "한국어",
"french": "Français",
"spanish": "Español",
"Hebrew": "עברית"
},
"fileSize": {
"zero": "0バイト",
@@ -203,7 +203,8 @@
"exampleImages": "例画像",
"misc": "その他",
"metadataArchive": "メタデータアーカイブデータベース",
- "proxySettings": "プロキシ設定"
+ "proxySettings": "プロキシ設定",
+ "priorityTags": "優先タグ"
},
"contentFiltering": {
"blurNsfwContent": "NSFWコンテンツをぼかす",
@@ -224,9 +225,9 @@
},
"displayDensityHelp": "1行に表示するカード数を選択",
"displayDensityDetails": {
- "default": "デフォルト:5(1080p)、6(2K)、8(4K)",
- "medium": "中:6(1080p)、7(2K)、9(4K)",
- "compact": "コンパクト:7(1080p)、8(2K)、10(4K)"
+ "default": "5(1080p)、6(2K)、8(4K)",
+ "medium": "6(1080p)、7(2K)、9(4K)",
+ "compact": "7(1080p)、8(2K)、10(4K)"
},
"displayDensityWarning": "警告:高密度設定は、リソースが限られたシステムでパフォーマンスの問題を引き起こす可能性があります。",
"cardInfoDisplay": "カード情報表示",
@@ -236,8 +237,18 @@
},
"cardInfoDisplayHelp": "モデル情報とアクションボタンの表示タイミングを選択:",
"cardInfoDisplayDetails": {
- "always": "常に表示:ヘッダーとフッターが常に表示されます",
- "hover": "ホバー時に表示:カードにホバーしたときのみヘッダーとフッターが表示されます"
+ "always": "ヘッダーとフッターが常に表示されます",
+ "hover": "カードにホバーしたときのみヘッダーとフッターが表示されます"
+ },
+ "modelNameDisplay": "モデル名表示",
+ "modelNameDisplayOptions": {
+ "modelName": "モデル名",
+ "fileName": "ファイル名"
+ },
+ "modelNameDisplayHelp": "モデルカードのフッターに表示する内容を選択:",
+ "modelNameDisplayDetails": {
+ "modelName": "モデルの説明的な名前を表示",
+ "fileName": "ディスク上の実際のファイル名を表示"
}
},
"folderSettings": {
@@ -345,6 +356,26 @@
"proxyPassword": "パスワード(任意)",
"proxyPasswordPlaceholder": "パスワード",
"proxyPasswordHelp": "プロキシ認証用のパスワード(必要な場合)"
+ },
+ "priorityTags": {
+ "title": "優先タグ",
+ "description": "各モデルタイプのタグ優先順位をカスタマイズします (例: character, concept, style(toon|toon_style))",
+ "placeholder": "character, concept, style(toon|toon_style)",
+ "helpLinkLabel": "優先タグのヘルプを開く",
+ "modelTypes": {
+ "lora": "LoRA",
+ "checkpoint": "チェックポイント",
+ "embedding": "埋め込み"
+ },
+ "saveSuccess": "優先タグを更新しました。",
+ "saveError": "優先タグの更新に失敗しました。",
+ "loadingSuggestions": "候補を読み込み中...",
+ "validation": {
+ "missingClosingParen": "エントリ {index} に閉じ括弧がありません。",
+ "missingCanonical": "エントリ {index} には正規タグ名を含める必要があります。",
+ "duplicateCanonical": "正規タグ \"{tag}\" が複数回登場しています。",
+ "unknown": "無効な優先タグ設定です。"
+ }
}
},
"loras": {
@@ -538,7 +569,10 @@
"recursiveOn": "サブフォルダーを検索",
"recursiveOff": "現在のフォルダーのみを検索",
"recursiveUnavailable": "再帰検索はツリービューでのみ利用できます",
- "collapseAllDisabled": "リストビューでは利用できません"
+ "collapseAllDisabled": "リストビューでは利用できません",
+ "dragDrop": {
+ "unableToResolveRoot": "移動先のパスを特定できません。"
+ }
},
"statistics": {
"title": "統計",
@@ -613,6 +647,14 @@
"downloadedPreview": "プレビュー画像をダウンロードしました",
"downloadingFile": "{type}ファイルをダウンロード中",
"finalizing": "ダウンロードを完了中..."
+ },
+ "progress": {
+ "currentFile": "現在のファイル:",
+ "downloading": "ダウンロード中: {name}",
+ "transferred": "ダウンロード済み: {downloaded} / {total}",
+ "transferredSimple": "ダウンロード済み: {downloaded}",
+ "transferredUnknown": "ダウンロード済み: --",
+ "speed": "速度: {speed}"
}
},
"move": {
@@ -1213,6 +1255,8 @@
"pauseFailed": "ダウンロードの一時停止に失敗しました:{error}",
"downloadResumed": "ダウンロードが再開されました",
"resumeFailed": "ダウンロードの再開に失敗しました:{error}",
+ "downloadStopped": "ダウンロードをキャンセルしました",
+ "stopFailed": "ダウンロードのキャンセルに失敗しました:{error}",
"deleted": "例画像が削除されました",
"deleteFailed": "例画像の削除に失敗しました",
"setPreviewFailed": "プレビュー画像の設定に失敗しました"


@@ -32,7 +32,7 @@
"korean": "한국어",
"french": "Français",
"spanish": "Español",
"Hebrew": "עברית"
},
"fileSize": {
"zero": "0 바이트",
@@ -203,7 +203,8 @@
"exampleImages": "예시 이미지",
"misc": "기타",
"metadataArchive": "메타데이터 아카이브 데이터베이스",
- "proxySettings": "프록시 설정"
+ "proxySettings": "프록시 설정",
+ "priorityTags": "우선순위 태그"
},
"contentFiltering": {
"blurNsfwContent": "NSFW 콘텐츠 블러 처리",
@@ -224,9 +225,9 @@
},
"displayDensityHelp": "한 줄에 표시할 카드 수를 선택하세요:",
"displayDensityDetails": {
- "default": "기본: 5개 (1080p), 6개 (2K), 8개 (4K)",
- "medium": "중간: 6개 (1080p), 7개 (2K), 9개 (4K)",
- "compact": "조밀: 7개 (1080p), 8개 (2K), 10개 (4K)"
+ "default": "5개 (1080p), 6개 (2K), 8개 (4K)",
+ "medium": "6개 (1080p), 7개 (2K), 9개 (4K)",
+ "compact": "7개 (1080p), 8개 (2K), 10개 (4K)"
},
"displayDensityWarning": "경고: 높은 밀도는 리소스가 제한된 시스템에서 성능 문제를 일으킬 수 있습니다.",
"cardInfoDisplay": "카드 정보 표시",
@@ -236,8 +237,18 @@
},
"cardInfoDisplayHelp": "모델 정보 및 액션 버튼을 언제 표시할지 선택하세요:",
"cardInfoDisplayDetails": {
- "always": "항상 표시: 헤더와 푸터가 항상 보입니다",
- "hover": "호버 시 표시: 카드에 마우스를 올렸을 때만 헤더와 푸터가 나타납니다"
+ "always": "헤더와 푸터가 항상 보입니다",
+ "hover": "카드에 마우스를 올렸을 때만 헤더와 푸터가 나타납니다"
+ },
+ "modelNameDisplay": "모델명 표시",
+ "modelNameDisplayOptions": {
+ "modelName": "모델명",
+ "fileName": "파일명"
+ },
+ "modelNameDisplayHelp": "모델 카드 하단에 표시할 내용을 선택하세요:",
+ "modelNameDisplayDetails": {
+ "modelName": "모델의 설명적 이름 표시",
+ "fileName": "디스크의 실제 파일명 표시"
}
},
"folderSettings": {
@@ -345,6 +356,26 @@
"proxyPassword": "비밀번호 (선택사항)",
"proxyPasswordPlaceholder": "password",
"proxyPasswordHelp": "프록시 인증에 필요한 비밀번호 (필요한 경우)"
+ },
+ "priorityTags": {
+ "title": "우선순위 태그",
+ "description": "모델 유형별 태그 우선순위를 사용자 지정합니다(예: character, concept, style(toon|toon_style)).",
+ "placeholder": "character, concept, style(toon|toon_style)",
+ "helpLinkLabel": "우선순위 태그 도움말 열기",
+ "modelTypes": {
+ "lora": "LoRA",
+ "checkpoint": "체크포인트",
+ "embedding": "임베딩"
+ },
+ "saveSuccess": "우선순위 태그가 업데이트되었습니다.",
+ "saveError": "우선순위 태그를 업데이트하지 못했습니다.",
+ "loadingSuggestions": "추천을 불러오는 중...",
+ "validation": {
+ "missingClosingParen": "{index}번째 항목에 닫는 괄호가 없습니다.",
+ "missingCanonical": "{index}번째 항목에는 정식 태그 이름이 포함되어야 합니다.",
+ "duplicateCanonical": "정식 태그 \"{tag}\"가 여러 번 나타납니다.",
+ "unknown": "잘못된 우선순위 태그 구성입니다."
+ }
}
},
"loras": {
@@ -538,7 +569,10 @@
"recursiveOn": "하위 폴더 검색",
"recursiveOff": "현재 폴더만 검색",
"recursiveUnavailable": "재귀 검색은 트리 보기에서만 사용할 수 있습니다",
- "collapseAllDisabled": "목록 보기에서는 사용할 수 없습니다"
+ "collapseAllDisabled": "목록 보기에서는 사용할 수 없습니다",
+ "dragDrop": {
+ "unableToResolveRoot": "이동할 대상 경로를 확인할 수 없습니다."
+ }
},
"statistics": {
"title": "통계",
@@ -613,6 +647,14 @@
"downloadedPreview": "미리보기 이미지 다운로드됨",
"downloadingFile": "{type} 파일 다운로드 중",
"finalizing": "다운로드 완료 중..."
+ },
+ "progress": {
+ "currentFile": "현재 파일:",
+ "downloading": "다운로드 중: {name}",
+ "transferred": "다운로드됨: {downloaded} / {total}",
+ "transferredSimple": "다운로드됨: {downloaded}",
+ "transferredUnknown": "다운로드됨: --",
+ "speed": "속도: {speed}"
}
},
"move": {
@@ -1213,6 +1255,8 @@
"pauseFailed": "다운로드 일시정지 실패: {error}",
"downloadResumed": "다운로드가 재개되었습니다",
"resumeFailed": "다운로드 재개 실패: {error}",
+ "downloadStopped": "다운로드가 취소되었습니다",
+ "stopFailed": "다운로드 취소 실패: {error}",
"deleted": "예시 이미지가 삭제되었습니다",
"deleteFailed": "예시 이미지 삭제 실패",
"setPreviewFailed": "미리보기 이미지 설정 실패"


@@ -32,7 +32,7 @@
"korean": "한국어",
"french": "Français",
"spanish": "Español",
"Hebrew": "עברית"
},
"fileSize": {
"zero": "0 Байт",
@@ -203,7 +203,8 @@
"exampleImages": "Примеры изображений",
"misc": "Разное",
"metadataArchive": "Архив метаданных",
- "proxySettings": "Настройки прокси"
+ "proxySettings": "Настройки прокси",
+ "priorityTags": "Приоритетные теги"
},
"contentFiltering": {
"blurNsfwContent": "Размывать NSFW контент",
@@ -224,9 +225,9 @@
},
"displayDensityHelp": "Выберите количество карточек для отображения в ряду:",
"displayDensityDetails": {
- "default": "По умолчанию: 5 (1080p), 6 (2K), 8 (4K)",
- "medium": "Средняя: 6 (1080p), 7 (2K), 9 (4K)",
- "compact": "Компактная: 7 (1080p), 8 (2K), 10 (4K)"
+ "default": "5 (1080p), 6 (2K), 8 (4K)",
+ "medium": "6 (1080p), 7 (2K), 9 (4K)",
+ "compact": "7 (1080p), 8 (2K), 10 (4K)"
},
"displayDensityWarning": "Предупреждение: Высокая плотность может вызвать проблемы с производительностью на системах с ограниченными ресурсами.",
"cardInfoDisplay": "Отображение информации карточки",
@@ -236,8 +237,18 @@
},
"cardInfoDisplayHelp": "Выберите когда отображать информацию о модели и кнопки действий:",
"cardInfoDisplayDetails": {
- "always": "Всегда видимо: Заголовки и подписи всегда видны",
- "hover": "Показать при наведении: Заголовки и подписи появляются только при наведении на карточку"
+ "always": "Заголовки и подписи всегда видны",
+ "hover": "Заголовки и подписи появляются только при наведении на карточку"
+ },
+ "modelNameDisplay": "Отображение названия модели",
+ "modelNameDisplayOptions": {
+ "modelName": "Название модели",
+ "fileName": "Имя файла"
+ },
+ "modelNameDisplayHelp": "Выберите, что отображать в нижней части карточки модели:",
+ "modelNameDisplayDetails": {
+ "modelName": "Отображать описательное название модели",
+ "fileName": "Отображать фактическое имя файла на диске"
}
},
"folderSettings": {
@@ -345,6 +356,26 @@
"proxyPassword": "Пароль (необязательно)",
"proxyPasswordPlaceholder": "пароль",
"proxyPasswordHelp": "Пароль для аутентификации на прокси (если требуется)"
+ },
+ "priorityTags": {
+ "title": "Приоритетные теги",
+ "description": "Настройте порядок приоритетов тегов для каждого типа моделей (например, character, concept, style(toon|toon_style)).",
+ "placeholder": "character, concept, style(toon|toon_style)",
+ "helpLinkLabel": "Открыть справку по приоритетным тегам",
+ "modelTypes": {
+ "lora": "LoRA",
+ "checkpoint": "Чекпойнт",
+ "embedding": "Эмбеддинг"
+ },
+ "saveSuccess": "Приоритетные теги обновлены.",
+ "saveError": "Не удалось обновить приоритетные теги.",
+ "loadingSuggestions": "Загрузка подсказок...",
+ "validation": {
+ "missingClosingParen": "В записи {index} отсутствует закрывающая скобка.",
+ "missingCanonical": "Запись {index} должна содержать каноническое имя тега.",
+ "duplicateCanonical": "Канонический тег \"{tag}\" встречается более одного раза.",
+ "unknown": "Недопустимая конфигурация приоритетных тегов."
+ }
}
},
"loras": {
@@ -538,7 +569,10 @@
"recursiveOn": "Искать во вложенных папках",
"recursiveOff": "Искать только в текущей папке",
"recursiveUnavailable": "Рекурсивный поиск доступен только в режиме дерева",
- "collapseAllDisabled": "Недоступно в виде списка"
+ "collapseAllDisabled": "Недоступно в виде списка",
+ "dragDrop": {
+ "unableToResolveRoot": "Не удалось определить путь назначения для перемещения."
+ }
},
"statistics": {
"title": "Статистика",
@@ -613,6 +647,14 @@
"downloadedPreview": "Превью изображение загружено",
"downloadingFile": "Загрузка файла {type}",
"finalizing": "Завершение загрузки..."
+ },
+ "progress": {
+ "currentFile": "Текущий файл:",
+ "downloading": "Скачивается: {name}",
+ "transferred": "Скачано: {downloaded} / {total}",
+ "transferredSimple": "Скачано: {downloaded}",
+ "transferredUnknown": "Скачано: --",
+ "speed": "Скорость: {speed}"
}
},
"move": {
@@ -1213,6 +1255,8 @@
"pauseFailed": "Не удалось приостановить загрузку: {error}",
"downloadResumed": "Загрузка возобновлена",
"resumeFailed": "Не удалось возобновить загрузку: {error}",
+ "downloadStopped": "Загрузка отменена",
+ "stopFailed": "Не удалось отменить загрузку: {error}",
"deleted": "Пример изображения удален",
"deleteFailed": "Не удалось удалить пример изображения",
"setPreviewFailed": "Не удалось установить превью изображение"


@@ -26,19 +26,13 @@
"english": "English",
"chinese_simplified": "中文(简体)",
"chinese_traditional": "中文(繁体)",
- "russian": "俄语",
- "german": "德语",
- "japanese": "日语",
- "korean": "韩语",
- "french": "法语",
- "spanish": "西班牙语",
- "Hebrew": "עברית",
"russian": "Русский",
"german": "Deutsch",
"japanese": "日本語",
"korean": "한국어",
"french": "Français",
- "spanish": "Español"
+ "spanish": "Español",
+ "Hebrew": "עברית"
},
"fileSize": {
"zero": "0 字节",
@@ -209,7 +203,8 @@
"exampleImages": "示例图片",
"misc": "其他",
"metadataArchive": "元数据归档数据库",
- "proxySettings": "代理设置"
+ "proxySettings": "代理设置",
+ "priorityTags": "优先标签"
},
"contentFiltering": {
"blurNsfwContent": "模糊 NSFW 内容",
@@ -230,9 +225,9 @@
},
"displayDensityHelp": "选择每行显示卡片数量:",
"displayDensityDetails": {
- "default": "默认:5 (1080p), 6 (2K), 8 (4K)",
- "medium": "中等:6 (1080p), 7 (2K), 9 (4K)",
- "compact": "紧凑:7 (1080p), 8 (2K), 10 (4K)"
+ "default": "5 (1080p), 6 (2K), 8 (4K)",
+ "medium": "6 (1080p), 7 (2K), 9 (4K)",
+ "compact": "7 (1080p), 8 (2K), 10 (4K)"
},
"displayDensityWarning": "警告:高密度可能导致资源有限的系统性能下降。",
"cardInfoDisplay": "卡片信息显示",
@@ -242,8 +237,18 @@
},
"cardInfoDisplayHelp": "选择何时显示模型信息和操作按钮:",
"cardInfoDisplayDetails": {
- "always": "始终可见:标题和底部始终显示",
- "hover": "悬停时显示:仅在悬停卡片时显示标题和底部"
+ "always": "标题和底部始终显示",
+ "hover": "仅在悬停卡片时显示标题和底部"
+ },
+ "modelNameDisplay": "模型名称显示",
+ "modelNameDisplayOptions": {
+ "modelName": "模型名称",
+ "fileName": "文件名"
+ },
+ "modelNameDisplayHelp": "选择在模型卡片底部显示的内容:",
+ "modelNameDisplayDetails": {
+ "modelName": "显示模型的描述性名称",
+ "fileName": "显示磁盘上的实际文件名"
}
},
"folderSettings": {
@@ -351,6 +356,26 @@
"proxyPassword": "密码 (可选)",
"proxyPasswordPlaceholder": "密码",
"proxyPasswordHelp": "代理认证的密码 (如果需要)"
+ },
+ "priorityTags": {
+ "title": "优先标签",
+ "description": "为每种模型类型自定义标签优先级顺序 (例如: character, concept, style(toon|toon_style))",
+ "placeholder": "character, concept, style(toon|toon_style)",
+ "helpLinkLabel": "打开优先标签帮助",
+ "modelTypes": {
+ "lora": "LoRA",
+ "checkpoint": "Checkpoint",
+ "embedding": "Embedding"
+ },
+ "saveSuccess": "优先标签已更新。",
+ "saveError": "优先标签更新失败。",
+ "loadingSuggestions": "正在加载建议...",
+ "validation": {
+ "missingClosingParen": "条目 {index} 缺少右括号。",
+ "missingCanonical": "条目 {index} 必须包含规范标签名称。",
+ "duplicateCanonical": "规范标签 \"{tag}\" 出现多次。",
+ "unknown": "优先标签配置无效。"
+ }
}
},
"loras": {
@@ -397,14 +422,14 @@
"selected": "已选中 {count} 项",
"selectedSuffix": "已选中",
"viewSelected": "查看已选中",
- "addTags": "为所添加标签",
- "setBaseModel": "为所设置基础模型",
- "setContentRating": "为全部设置内容评级",
- "copyAll": "复制全部语法",
- "refreshAll": "刷新全部元数据",
- "moveAll": "全部移动到文件夹",
+ "addTags": "为所选中添加标签",
+ "setBaseModel": "为所选中设置基础模型",
+ "setContentRating": "为所选中设置内容评级",
+ "copyAll": "复制所选中语法",
+ "refreshAll": "刷新所选中元数据",
+ "moveAll": "移动所选中到文件夹",
"autoOrganize": "自动整理所选模型",
- "deleteAll": "删除所有模型",
+ "deleteAll": "删除选中模型",
"clear": "清除选择",
"autoOrganizeProgress": {
"initializing": "正在初始化自动整理...",
@@ -544,7 +569,10 @@
"recursiveOn": "搜索子文件夹",
"recursiveOff": "仅搜索当前文件夹",
"recursiveUnavailable": "仅在树形视图中可使用递归搜索",
- "collapseAllDisabled": "列表视图下不可用"
+ "collapseAllDisabled": "列表视图下不可用",
+ "dragDrop": {
+ "unableToResolveRoot": "无法确定移动的目标路径。"
+ }
},
"statistics": {
"title": "统计",
@@ -619,6 +647,14 @@
"downloadedPreview": "预览图片已下载",
"downloadingFile": "正在下载 {type} 文件",
"finalizing": "正在完成下载..."
+ },
+ "progress": {
+ "currentFile": "当前文件:",
+ "downloading": "下载中:{name}",
+ "transferred": "已下载:{downloaded} / {total}",
+ "transferredSimple": "已下载:{downloaded}",
+ "transferredUnknown": "已下载:--",
+ "speed": "速度:{speed}"
}
},
"move": {
@@ -1219,6 +1255,8 @@
"pauseFailed": "暂停下载失败:{error}",
"downloadResumed": "下载已恢复",
"resumeFailed": "恢复下载失败:{error}",
+ "downloadStopped": "下载已取消",
+ "stopFailed": "取消下载失败:{error}",
"deleted": "示例图片已删除",
"deleteFailed": "删除示例图片失败",
"setPreviewFailed": "设置预览图片失败"


@@ -32,7 +32,7 @@
"korean": "한국어",
"french": "Français",
"spanish": "Español",
"Hebrew": "עברית"
},
"fileSize": {
"zero": "0 位元組",
@@ -203,7 +203,8 @@
"exampleImages": "範例圖片",
"misc": "其他",
"metadataArchive": "中繼資料封存資料庫",
- "proxySettings": "代理設定"
+ "proxySettings": "代理設定",
+ "priorityTags": "優先標籤"
},
"contentFiltering": {
"blurNsfwContent": "模糊 NSFW 內容",
@@ -224,9 +225,9 @@
},
"displayDensityHelp": "選擇每行顯示卡片數量:",
"displayDensityDetails": {
- "default": "預設:5(1080p)、6(2K)、8(4K)",
- "medium": "中等:6(1080p)、7(2K)、9(4K)",
- "compact": "緊湊:7(1080p)、8(2K)、10(4K)"
+ "default": "5(1080p)、6(2K)、8(4K)",
+ "medium": "6(1080p)、7(2K)、9(4K)",
+ "compact": "7(1080p)、8(2K)、10(4K)"
},
"displayDensityWarning": "警告:較高密度可能導致資源有限的系統效能下降。",
"cardInfoDisplay": "卡片資訊顯示",
@@ -236,8 +237,18 @@
},
"cardInfoDisplayHelp": "選擇何時顯示模型資訊與操作按鈕:",
"cardInfoDisplayDetails": {
- "always": "永遠顯示:標題與頁腳始終可見",
- "hover": "滑鼠懸停顯示:標題與頁腳僅在滑鼠懸停時顯示"
+ "always": "標題與頁腳始終可見",
+ "hover": "標題與頁腳僅在滑鼠懸停時顯示"
+ },
+ "modelNameDisplay": "模型名稱顯示",
+ "modelNameDisplayOptions": {
+ "modelName": "模型名稱",
+ "fileName": "檔案名稱"
+ },
+ "modelNameDisplayHelp": "選擇在模型卡片底部顯示的內容:",
+ "modelNameDisplayDetails": {
+ "modelName": "顯示模型的描述性名稱",
+ "fileName": "顯示磁碟上的實際檔案名稱"
}
},
"folderSettings": {
@@ -345,6 +356,26 @@
"proxyPassword": "密碼(選填)",
"proxyPasswordPlaceholder": "password",
"proxyPasswordHelp": "代理驗證所需的密碼(如有需要)"
+ },
+ "priorityTags": {
+ "title": "優先標籤",
+ "description": "為每種模型類型自訂標籤的優先順序 (例如: character, concept, style(toon|toon_style))",
+ "placeholder": "character, concept, style(toon|toon_style)",
+ "helpLinkLabel": "開啟優先標籤說明",
+ "modelTypes": {
+ "lora": "LoRA",
+ "checkpoint": "Checkpoint",
+ "embedding": "Embedding"
+ },
+ "saveSuccess": "優先標籤已更新。",
+ "saveError": "更新優先標籤失敗。",
+ "loadingSuggestions": "正在載入建議...",
+ "validation": {
+ "missingClosingParen": "項目 {index} 缺少右括號。",
+ "missingCanonical": "項目 {index} 必須包含正規標籤名稱。",
+ "duplicateCanonical": "正規標籤 \"{tag}\" 出現多於一次。",
+ "unknown": "優先標籤設定無效。"
+ }
}
},
"loras": {
@@ -538,7 +569,10 @@
"recursiveOn": "搜尋子資料夾",
"recursiveOff": "僅搜尋目前資料夾",
"recursiveUnavailable": "遞迴搜尋僅能在樹狀檢視中使用",
- "collapseAllDisabled": "列表檢視下不可用"
+ "collapseAllDisabled": "列表檢視下不可用",
+ "dragDrop": {
+ "unableToResolveRoot": "無法確定移動的目標路徑。"
+ }
},
"statistics": {
"title": "統計",
@@ -613,6 +647,14 @@
"downloadedPreview": "已下載預覽圖片",
"downloadingFile": "正在下載 {type} 檔案",
"finalizing": "完成下載中..."
+ },
+ "progress": {
+ "currentFile": "目前檔案:",
+ "downloading": "下載中:{name}",
+ "transferred": "已下載:{downloaded} / {total}",
+ "transferredSimple": "已下載:{downloaded}",
+ "transferredUnknown": "已下載:--",
+ "speed": "速度:{speed}"
}
},
"move": {
@@ -1213,6 +1255,8 @@
"pauseFailed": "暫停下載失敗:{error}",
"downloadResumed": "下載已恢復",
"resumeFailed": "恢復下載失敗:{error}",
+ "downloadStopped": "下載已取消",
+ "stopFailed": "取消下載失敗:{error}",
"deleted": "範例圖片已刪除",
"deleteFailed": "刪除範例圖片失敗",
"setPreviewFailed": "設定預覽圖片失敗"


@@ -666,6 +666,7 @@ NODE_EXTRACTORS = {
    "LoraManagerLoader": LoraLoaderManagerExtractor,
    # Conditioning
    "CLIPTextEncode": CLIPTextEncodeExtractor,
+   "PromptLoraManager": CLIPTextEncodeExtractor,
    "CLIPTextEncodeFlux": CLIPTextEncodeFluxExtractor,  # Add CLIPTextEncodeFlux
    "WAS_Text_to_Conditioning": CLIPTextEncodeExtractor,
    "AdvancedCLIPTextEncode": CLIPTextEncodeExtractor,  # From https://github.com/BlenderNeko/ComfyUI_ADV_CLIP_emb


@@ -1,7 +1,6 @@
import logging
import re
from nodes import LoraLoader
- from comfy.comfy_types import IO  # type: ignore
from ..utils.utils import get_lora_info
from .utils import FlexibleOptionalInputType, any_type, extract_lora_name, get_loras_list, nunchaku_load_lora
@@ -17,7 +16,7 @@ class LoraManagerLoader:
            "required": {
                "model": ("MODEL",),
                # "clip": ("CLIP",),
-               "text": (IO.STRING, {
+               "text": ("STRING", {
                    "multiline": True,
                    "pysssss.autocomplete": False,
                    "dynamicPrompts": True,
@@ -28,7 +27,7 @@ class LoraManagerLoader:
        "optional": FlexibleOptionalInputType(any_type),
    }
-   RETURN_TYPES = ("MODEL", "CLIP", IO.STRING, IO.STRING)
+   RETURN_TYPES = ("MODEL", "CLIP", "STRING", "STRING")
    RETURN_NAMES = ("MODEL", "CLIP", "trigger_words", "loaded_loras")
    FUNCTION = "load_loras"
@@ -141,7 +140,7 @@ class LoraManagerTextLoader:
        return {
            "required": {
                "model": ("MODEL",),
-               "lora_syntax": (IO.STRING, {
+               "lora_syntax": ("STRING", {
                    "defaultInput": True,
                    "forceInput": True,
                    "tooltip": "Format: <lora:lora_name:strength> separated by spaces or punctuation"
@@ -153,7 +152,7 @@ class LoraManagerTextLoader:
        }
    }
-   RETURN_TYPES = ("MODEL", "CLIP", IO.STRING, IO.STRING)
+   RETURN_TYPES = ("MODEL", "CLIP", "STRING", "STRING")
    RETURN_NAMES = ("MODEL", "CLIP", "trigger_words", "loaded_loras")
    FUNCTION = "load_loras_from_text"


@@ -1,4 +1,3 @@
- from comfy.comfy_types import IO  # type: ignore
import os
from ..utils.utils import get_lora_info
from .utils import FlexibleOptionalInputType, any_type, extract_lora_name, get_loras_list
@@ -15,7 +14,7 @@ class LoraStacker:
    def INPUT_TYPES(cls):
        return {
            "required": {
-               "text": (IO.STRING, {
+               "text": ("STRING", {
                    "multiline": True,
                    "pysssss.autocomplete": False,
                    "dynamicPrompts": True,
@@ -26,7 +25,7 @@ class LoraStacker:
        "optional": FlexibleOptionalInputType(any_type),
    }
-   RETURN_TYPES = ("LORA_STACK", IO.STRING, IO.STRING)
+   RETURN_TYPES = ("LORA_STACK", "STRING", "STRING")
    RETURN_NAMES = ("LORA_STACK", "trigger_words", "active_loras")
    FUNCTION = "stack_loras"

py/nodes/prompt.py (new file, 59 lines)

@@ -0,0 +1,59 @@
from typing import Any, Optional


class PromptLoraManager:
    """Encodes text (and optional trigger words) into CLIP conditioning."""

    NAME = "Prompt (LoraManager)"
    CATEGORY = "Lora Manager/conditioning"
    DESCRIPTION = (
        "Encodes a text prompt using a CLIP model into an embedding that can be used "
        "to guide the diffusion model towards generating specific images."
    )

    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "text": (
                    'STRING',
                    {
                        "multiline": True,
                        "pysssss.autocomplete": False,
                        "dynamicPrompts": True,
                        "tooltip": "The text to be encoded.",
                    },
                ),
                "clip": (
                    'CLIP',
                    {"tooltip": "The CLIP model used for encoding the text."},
                ),
            },
            "optional": {
                "trigger_words": (
                    'STRING',
                    {
                        "forceInput": True,
                        "tooltip": (
                            "Optional trigger words to prepend to the text before "
                            "encoding."
                        ),
                    },
                )
            },
        }

    RETURN_TYPES = ('CONDITIONING', 'STRING',)
    RETURN_NAMES = ('CONDITIONING', 'PROMPT',)
    OUTPUT_TOOLTIPS = (
        "A conditioning containing the embedded text used to guide the diffusion model.",
    )
    FUNCTION = "encode"

    def encode(self, text: str, clip: Any, trigger_words: Optional[str] = None):
        prompt = text
        if trigger_words:
            prompt = ", ".join([trigger_words, text])
        from nodes import CLIPTextEncode  # type: ignore
        conditioning = CLIPTextEncode().encode(clip, prompt)[0]
        return (conditioning, prompt,)
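The only logic in `encode` beyond delegating to `CLIPTextEncode` is the prompt assembly: trigger words, when supplied, are prepended with a comma separator. That step can be exercised standalone without ComfyUI; the `build_prompt` helper below is an extraction for illustration, not part of the node itself:

```python
from typing import Optional


def build_prompt(text: str, trigger_words: Optional[str] = None) -> str:
    # Mirrors the prompt-assembly step of PromptLoraManager.encode:
    # non-empty trigger words are prepended to the text with ", ".
    if trigger_words:
        return ", ".join([trigger_words, text])
    return text
```

Note that an empty `trigger_words` string is falsy, so passing `""` (e.g. from a LoRA with no trigger words) leaves the prompt unchanged rather than prepending a stray comma.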


@@ -1,6 +1,5 @@
import json
import re
- from server import PromptServer  # type: ignore
from .utils import FlexibleOptionalInputType, any_type
import logging


@@ -1,4 +1,3 @@
- from comfy.comfy_types import IO  # type: ignore
import folder_paths  # type: ignore
from ..utils.utils import get_lora_info
from .utils import FlexibleOptionalInputType, any_type, get_loras_list
@@ -16,7 +15,7 @@ class WanVideoLoraSelect:
            "required": {
                "low_mem_load": ("BOOLEAN", {"default": False, "tooltip": "Load LORA models with less VRAM usage, slower loading. This affects ALL LoRAs, not just the current ones. No effect if merge_loras is False"}),
                "merge_loras": ("BOOLEAN", {"default": True, "tooltip": "Merge LoRAs into the model, otherwise they are loaded on the fly. Always disabled for GGUF and scaled fp8 models. This affects ALL LoRAs, not just the current one"}),
-               "text": (IO.STRING, {
+               "text": ("STRING", {
                    "multiline": True,
                    "pysssss.autocomplete": False,
                    "dynamicPrompts": True,
@@ -27,7 +26,7 @@ class WanVideoLoraSelect:
        "optional": FlexibleOptionalInputType(any_type),
    }
-   RETURN_TYPES = ("WANVIDLORA", IO.STRING, IO.STRING)
+   RETURN_TYPES = ("WANVIDLORA", "STRING", "STRING")
    RETURN_NAMES = ("lora", "trigger_words", "active_loras")
    FUNCTION = "process_loras"


@@ -1,5 +1,4 @@
- from comfy.comfy_types import IO
- import folder_paths
+ import folder_paths  # type: ignore
from ..utils.utils import get_lora_info
from .utils import any_type
import logging
@@ -20,7 +19,7 @@ class WanVideoLoraSelectFromText:
            "required": {
                "low_mem_load": ("BOOLEAN", {"default": False, "tooltip": "Load LORA models with less VRAM usage, slower loading. This affects ALL LoRAs, not just the current ones. No effect if merge_loras is False"}),
                "merge_lora": ("BOOLEAN", {"default": True, "tooltip": "Merge LoRAs into the model, otherwise they are loaded on the fly. Always disabled for GGUF and scaled fp8 models. This affects ALL LoRAs, not just the current one"}),
-               "lora_syntax": (IO.STRING, {
+               "lora_syntax": ("STRING", {
                    "multiline": True,
                    "defaultInput": True,
                    "forceInput": True,
@@ -34,7 +33,7 @@ class WanVideoLoraSelectFromText:
        }
    }
-   RETURN_TYPES = ("WANVIDLORA", IO.STRING, IO.STRING)
+   RETURN_TYPES = ("WANVIDLORA", "STRING", "STRING")
    RETURN_NAMES = ("lora", "trigger_words", "active_loras")
    FUNCTION = "process_loras_from_syntax"


@@ -2,7 +2,7 @@ from __future__ import annotations
import logging import logging
from abc import ABC, abstractmethod from abc import ABC, abstractmethod
from typing import Callable, Dict, Mapping from typing import TYPE_CHECKING, Callable, Dict, Mapping
import jinja2 import jinja2
from aiohttp import web from aiohttp import web
@@ -42,8 +42,12 @@ from .handlers.model_handlers import (
ModelMoveHandler, ModelMoveHandler,
ModelPageView, ModelPageView,
ModelQueryHandler, ModelQueryHandler,
ModelUpdateHandler,
) )
if TYPE_CHECKING:
from ..services.model_update_service import ModelUpdateService
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@@ -99,10 +103,18 @@ class BaseModelRoutes(ABC):
ws_manager=self._ws_manager, ws_manager=self._ws_manager,
download_manager_factory=ServiceRegistry.get_download_manager, download_manager_factory=ServiceRegistry.get_download_manager,
) )
self._model_update_service: ModelUpdateService | None = None
if service is not None: if service is not None:
self.attach_service(service) self.attach_service(service)
def set_model_update_service(self, service: "ModelUpdateService") -> None:
"""Attach the model update tracking service."""
self._model_update_service = service
self._handler_set = None
self._handler_mapping = None
def attach_service(self, service) -> None:
"""Attach a model service and rebuild handler dependencies."""
self.service = service
@@ -127,6 +139,7 @@ class BaseModelRoutes(ABC):
def _create_handler_set(self) -> ModelHandlerSet:
service = self._ensure_service()
update_service = self._ensure_model_update_service()
page_view = ModelPageView(
template_env=self.template_env,
template_name=self.template_name or "",
@@ -186,6 +199,12 @@ class BaseModelRoutes(ABC):
ws_manager=self._ws_manager,
logger=logger,
)
updates = ModelUpdateHandler(
service=service,
update_service=update_service,
metadata_provider_selector=get_metadata_provider,
logger=logger,
)
return ModelHandlerSet(
page_view=page_view,
listing=listing,
@@ -195,6 +214,7 @@ class BaseModelRoutes(ABC):
civitai=civitai,
move=move,
auto_organize=auto_organize,
updates=updates,
)
@property
@@ -273,3 +293,8 @@ class BaseModelRoutes(ABC):
return proxy
def _ensure_model_update_service(self) -> "ModelUpdateService":
if self._model_update_service is None:
raise RuntimeError("Model update service has not been attached")
return self._model_update_service
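The `set_model_update_service` hook above drops the cached `_handler_set`/`_handler_mapping` so the next access rebuilds handlers against the newly attached dependency. A minimal standalone sketch of that invalidate-on-mutate pattern (the class and names below are illustrative stand-ins, not the repository's API):

```python
class Routes:
    """Toy version of the cached-handler pattern: mutating a dependency
    resets the cache so handlers are rebuilt lazily on next access."""

    def __init__(self):
        self._update_service = None
        self._handler_set = None  # cache, rebuilt on demand

    def set_update_service(self, service):
        self._update_service = service
        self._handler_set = None  # invalidate so the build runs again

    @property
    def handlers(self):
        if self._handler_set is None:
            if self._update_service is None:
                raise RuntimeError("update service has not been attached")
            self._handler_set = {"updates": self._update_service}
        return self._handler_set
```

Rebuilding lazily keeps construction order flexible: services can be attached in any order during startup, and handlers only materialize once everything they need is present.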

View File

@@ -20,7 +20,9 @@ class CheckpointRoutes(BaseModelRoutes):
async def initialize_services(self):
"""Initialize services from ServiceRegistry"""
checkpoint_scanner = await ServiceRegistry.get_checkpoint_scanner()
-self.service = CheckpointService(checkpoint_scanner)
+update_service = await ServiceRegistry.get_model_update_service()
+self.service = CheckpointService(checkpoint_scanner, update_service=update_service)
+self.set_model_update_service(update_service)
# Attach service dependencies
self.attach_service(self.service)

View File

@@ -19,7 +19,9 @@ class EmbeddingRoutes(BaseModelRoutes):
async def initialize_services(self):
"""Initialize services from ServiceRegistry"""
embedding_scanner = await ServiceRegistry.get_embedding_scanner()
-self.service = EmbeddingService(embedding_scanner)
+update_service = await ServiceRegistry.get_model_update_service()
+self.service = EmbeddingService(embedding_scanner, update_service=update_service)
+self.set_model_update_service(update_service)
# Attach service dependencies
self.attach_service(self.service)

View File

@@ -22,6 +22,7 @@ ROUTE_DEFINITIONS: tuple[RouteDefinition, ...] = (
RouteDefinition("GET", "/api/lm/example-images-status", "get_example_images_status"), RouteDefinition("GET", "/api/lm/example-images-status", "get_example_images_status"),
RouteDefinition("POST", "/api/lm/pause-example-images", "pause_example_images"), RouteDefinition("POST", "/api/lm/pause-example-images", "pause_example_images"),
RouteDefinition("POST", "/api/lm/resume-example-images", "resume_example_images"), RouteDefinition("POST", "/api/lm/resume-example-images", "resume_example_images"),
RouteDefinition("POST", "/api/lm/stop-example-images", "stop_example_images"),
RouteDefinition("POST", "/api/lm/open-example-images-folder", "open_example_images_folder"), RouteDefinition("POST", "/api/lm/open-example-images-folder", "open_example_images_folder"),
RouteDefinition("GET", "/api/lm/example-image-files", "get_example_image_files"), RouteDefinition("GET", "/api/lm/example-image-files", "get_example_image_files"),
RouteDefinition("GET", "/api/lm/has-example-images", "has_example_images"), RouteDefinition("GET", "/api/lm/has-example-images", "has_example_images"),

View File

@@ -68,6 +68,13 @@ class ExampleImagesDownloadHandler:
except DownloadNotRunningError as exc:
return web.json_response({'success': False, 'error': str(exc)}, status=400)
async def stop_example_images(self, request: web.Request) -> web.StreamResponse:
try:
result = await self._download_manager.stop_download(request)
return web.json_response(result)
except DownloadNotRunningError as exc:
return web.json_response({'success': False, 'error': str(exc)}, status=400)
async def force_download_example_images(self, request: web.Request) -> web.StreamResponse:
try:
payload = await request.json()
@@ -149,6 +156,7 @@ class ExampleImagesHandlerSet:
"get_example_images_status": self.download.get_example_images_status,
"pause_example_images": self.download.pause_example_images,
"resume_example_images": self.download.resume_example_images,
"stop_example_images": self.download.stop_example_images,
"force_download_example_images": self.download.force_download_example_images,
"import_example_images": self.management.import_example_images,
"delete_example_image": self.management.delete_example_image,

View File

@@ -162,6 +162,8 @@ class SettingsHandler:
"include_trigger_words", "include_trigger_words",
"show_only_sfw", "show_only_sfw",
"compact_mode", "compact_mode",
"priority_tags",
"model_name_display",
) )
_PROXY_KEYS = {"proxy_enabled", "proxy_host", "proxy_port", "proxy_username", "proxy_password", "proxy_type"} _PROXY_KEYS = {"proxy_enabled", "proxy_host", "proxy_port", "proxy_username", "proxy_password", "proxy_type"}
@@ -207,6 +209,14 @@ class SettingsHandler:
logger.error("Error getting settings: %s", exc, exc_info=True) logger.error("Error getting settings: %s", exc, exc_info=True)
return web.json_response({"success": False, "error": str(exc)}, status=500) return web.json_response({"success": False, "error": str(exc)}, status=500)
async def get_priority_tags(self, request: web.Request) -> web.Response:
try:
suggestions = self._settings.get_priority_tag_suggestions()
return web.json_response({"success": True, "tags": suggestions})
except Exception as exc: # pragma: no cover - defensive logging
logger.error("Error getting priority tags: %s", exc, exc_info=True)
return web.json_response({"success": False, "error": str(exc)}, status=500)
async def activate_library(self, request: web.Request) -> web.Response:
"""Activate the selected library."""
@@ -942,6 +952,7 @@ class MiscHandlerSet:
"health_check": self.health.health_check,
"get_settings": self.settings.get_settings,
"update_settings": self.settings.update_settings,
"get_priority_tags": self.settings.get_priority_tags,
"get_settings_libraries": self.settings.get_libraries,
"activate_library": self.settings.activate_library,
"update_usage_stats": self.usage_stats.update_usage_stats,

View File

@@ -6,7 +6,7 @@ import json
import logging
import os
from dataclasses import dataclass
-from typing import Awaitable, Callable, Dict, Iterable, Mapping, Optional
+from typing import Awaitable, Callable, Dict, Iterable, List, Mapping, Optional
from aiohttp import web
import jinja2
@@ -29,6 +29,7 @@ from ...services.use_cases import (
)
from ...services.websocket_manager import WebSocketManager
from ...services.websocket_progress_callback import WebSocketProgressCallback
from ...services.errors import RateLimitError
from ...utils.file_utils import calculate_sha256
from ...utils.metadata_manager import MetadataManager
@@ -165,6 +166,11 @@ class ModelListingHandler:
except (json.JSONDecodeError, TypeError):
pass
has_update = request.query.get("has_update", "false")
has_update_filter = (
has_update.lower() in {"1", "true", "yes"} if isinstance(has_update, str) else False
)
return {
"page": page,
"page_size": page_size,
@@ -177,6 +183,7 @@ class ModelListingHandler:
"search_options": search_options,
"hash_filters": hash_filters,
"favorites_only": favorites_only,
"has_update": has_update_filter,
**self._parse_specific_params(request),
}
@@ -758,6 +765,30 @@ class ModelDownloadHandler:
self._logger.error("Error cancelling download via GET: %s", exc, exc_info=True) self._logger.error("Error cancelling download via GET: %s", exc, exc_info=True)
return web.json_response({"success": False, "error": str(exc)}, status=500) return web.json_response({"success": False, "error": str(exc)}, status=500)
async def pause_download_get(self, request: web.Request) -> web.Response:
try:
download_id = request.query.get("download_id")
if not download_id:
return web.json_response({"success": False, "error": "Download ID is required"}, status=400)
result = await self._download_coordinator.pause_download(download_id)
status = 200 if result.get("success") else 400
return web.json_response(result, status=status)
except Exception as exc:
self._logger.error("Error pausing download via GET: %s", exc, exc_info=True)
return web.json_response({"success": False, "error": str(exc)}, status=500)
async def resume_download_get(self, request: web.Request) -> web.Response:
try:
download_id = request.query.get("download_id")
if not download_id:
return web.json_response({"success": False, "error": "Download ID is required"}, status=400)
result = await self._download_coordinator.resume_download(download_id)
status = 200 if result.get("success") else 400
return web.json_response(result, status=status)
except Exception as exc:
self._logger.error("Error resuming download via GET: %s", exc, exc_info=True)
return web.json_response({"success": False, "error": str(exc)}, status=500)
async def get_download_progress(self, request: web.Request) -> web.Response:
try:
download_id = request.match_info.get("download_id")
@@ -766,7 +797,23 @@ class ModelDownloadHandler:
progress_data = self._ws_manager.get_download_progress(download_id)
if progress_data is None:
return web.json_response({"success": False, "error": "Download ID not found"}, status=404)
-return web.json_response({"success": True, "progress": progress_data.get("progress", 0)})
+response_payload = {
"success": True,
"progress": progress_data.get("progress", 0),
"bytes_downloaded": progress_data.get("bytes_downloaded"),
"total_bytes": progress_data.get("total_bytes"),
"bytes_per_second": progress_data.get("bytes_per_second", 0.0),
}
status = progress_data.get("status")
if status and status != "progress":
response_payload["status"] = status
if "message" in progress_data:
response_payload["message"] = progress_data["message"]
elif status is None and "message" in progress_data:
response_payload["message"] = progress_data["message"]
return web.json_response(response_payload)
except Exception as exc:
self._logger.error("Error getting download progress: %s", exc, exc_info=True)
return web.json_response({"success": False, "error": str(exc)}, status=500)
@@ -977,6 +1024,156 @@ class ModelAutoOrganizeHandler:
return web.json_response({"success": False, "error": str(exc)}, status=500)
class ModelUpdateHandler:
"""Handle update tracking requests."""
def __init__(
self,
*,
service,
update_service,
metadata_provider_selector,
logger: logging.Logger,
) -> None:
self._service = service
self._update_service = update_service
self._metadata_provider_selector = metadata_provider_selector
self._logger = logger
async def refresh_model_updates(self, request: web.Request) -> web.Response:
payload = await self._read_json(request)
force_refresh = self._parse_bool(request.query.get("force")) or self._parse_bool(
payload.get("force")
)
provider = await self._get_civitai_provider()
if provider is None:
return web.json_response(
{"success": False, "error": "Civitai provider not available"}, status=503
)
try:
records = await self._update_service.refresh_for_model_type(
self._service.model_type,
self._service.scanner,
provider,
force_refresh=force_refresh,
)
except RateLimitError as exc:
return web.json_response(
{"success": False, "error": str(exc) or "Rate limited"}, status=429
)
except Exception as exc: # pragma: no cover - defensive logging
self._logger.error("Failed to refresh model updates: %s", exc, exc_info=True)
return web.json_response({"success": False, "error": str(exc)}, status=500)
return web.json_response(
{
"success": True,
"records": [self._serialize_record(record) for record in records.values()],
}
)
async def set_model_update_ignore(self, request: web.Request) -> web.Response:
payload = await self._read_json(request)
model_id = self._normalize_model_id(payload.get("modelId"))
if model_id is None:
return web.json_response({"success": False, "error": "modelId is required"}, status=400)
should_ignore = self._parse_bool(payload.get("shouldIgnore"))
record = await self._update_service.set_should_ignore(
self._service.model_type, model_id, should_ignore
)
return web.json_response({"success": True, "record": self._serialize_record(record)})
async def get_model_update_status(self, request: web.Request) -> web.Response:
model_id = self._normalize_model_id(request.match_info.get("model_id"))
if model_id is None:
return web.json_response({"success": False, "error": "model_id must be an integer"}, status=400)
refresh = self._parse_bool(request.query.get("refresh"))
force = self._parse_bool(request.query.get("force"))
try:
record = await self._get_or_refresh_record(model_id, refresh=refresh, force=force)
except RateLimitError as exc:
return web.json_response(
{"success": False, "error": str(exc) or "Rate limited"}, status=429
)
if record is None:
return web.json_response(
{"success": False, "error": "Model not tracked"}, status=404
)
return web.json_response({"success": True, "record": self._serialize_record(record)})
async def _get_or_refresh_record(
self, model_id: int, *, refresh: bool, force: bool
) -> Optional[object]:
record = await self._update_service.get_record(self._service.model_type, model_id)
if record and not refresh and not force:
return record
provider = await self._get_civitai_provider()
if provider is None:
return record
return await self._update_service.refresh_single_model(
self._service.model_type,
model_id,
self._service.scanner,
provider,
force_refresh=force or refresh,
)
async def _get_civitai_provider(self):
try:
return await self._metadata_provider_selector("civitai_api")
except Exception as exc: # pragma: no cover - defensive log
self._logger.error("Failed to acquire civitai provider: %s", exc, exc_info=True)
return None
async def _read_json(self, request: web.Request) -> Dict:
if not request.can_read_body:
return {}
try:
return await request.json()
except Exception:
return {}
@staticmethod
def _parse_bool(value) -> bool:
if isinstance(value, bool):
return value
if isinstance(value, str):
return value.lower() in {"1", "true", "yes"}
if isinstance(value, (int, float)):
return bool(value)
return False
@staticmethod
def _normalize_model_id(value) -> Optional[int]:
try:
if value is None:
return None
return int(value)
except (TypeError, ValueError):
return None
@staticmethod
def _serialize_record(record) -> Dict:
return {
"modelType": record.model_type,
"modelId": record.model_id,
"largestVersionId": record.largest_version_id,
"versionIds": record.version_ids,
"inLibraryVersionIds": record.in_library_version_ids,
"lastCheckedAt": record.last_checked_at,
"shouldIgnore": record.should_ignore,
"hasUpdate": record.has_update(),
}
@dataclass
class ModelHandlerSet:
"""Aggregate concrete handlers into a flat mapping."""
@@ -989,6 +1186,7 @@ class ModelHandlerSet:
civitai: ModelCivitaiHandler
move: ModelMoveHandler
auto_organize: ModelAutoOrganizeHandler
updates: ModelUpdateHandler
def to_route_mapping(self) -> Dict[str, Callable[[web.Request], Awaitable[web.Response]]]:
return {
@@ -1017,6 +1215,8 @@ class ModelHandlerSet:
"download_model": self.download.download_model, "download_model": self.download.download_model,
"download_model_get": self.download.download_model_get, "download_model_get": self.download.download_model_get,
"cancel_download_get": self.download.cancel_download_get, "cancel_download_get": self.download.cancel_download_get,
"pause_download_get": self.download.pause_download_get,
"resume_download_get": self.download.resume_download_get,
"get_download_progress": self.download.get_download_progress, "get_download_progress": self.download.get_download_progress,
"get_civitai_versions": self.civitai.get_civitai_versions, "get_civitai_versions": self.civitai.get_civitai_versions,
"get_civitai_model_by_version": self.civitai.get_civitai_model_by_version, "get_civitai_model_by_version": self.civitai.get_civitai_model_by_version,
@@ -1031,5 +1231,8 @@ class ModelHandlerSet:
"get_model_metadata": self.query.get_model_metadata, "get_model_metadata": self.query.get_model_metadata,
"get_model_description": self.query.get_model_description, "get_model_description": self.query.get_model_description,
"get_relative_paths": self.query.get_relative_paths, "get_relative_paths": self.query.get_relative_paths,
"refresh_model_updates": self.updates.refresh_model_updates,
"set_model_update_ignore": self.updates.set_model_update_ignore,
"get_model_update_status": self.updates.get_model_update_status,
}
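`ModelUpdateHandler` above deliberately parses request input leniently: booleans may arrive as JSON booleans, query strings, or numbers, and model ids may arrive as strings from the URL path. The two static helpers, restated as standalone functions (logic copied from the diff; the free-function names are ours):

```python
from typing import Optional

def parse_bool(value) -> bool:
    # Accept bools, truthy strings, and numbers; anything else is False.
    if isinstance(value, bool):
        return value
    if isinstance(value, str):
        return value.lower() in {"1", "true", "yes"}
    if isinstance(value, (int, float)):
        return bool(value)
    return False

def normalize_model_id(value) -> Optional[int]:
    # Coerce path/payload values to int, returning None when impossible
    # so callers can answer with a 400 instead of raising.
    try:
        if value is None:
            return None
        return int(value)
    except (TypeError, ValueError):
        return None
```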

View File

@@ -23,7 +23,9 @@ class LoraRoutes(BaseModelRoutes):
async def initialize_services(self):
"""Initialize services from ServiceRegistry"""
lora_scanner = await ServiceRegistry.get_lora_scanner()
-self.service = LoraService(lora_scanner)
+update_service = await ServiceRegistry.get_model_update_service()
+self.service = LoraService(lora_scanner, update_service=update_service)
+self.set_model_update_service(update_service)
# Attach service dependencies
self.attach_service(self.service)

View File

@@ -22,6 +22,7 @@ class RouteDefinition:
MISC_ROUTE_DEFINITIONS: tuple[RouteDefinition, ...] = (
RouteDefinition("GET", "/api/lm/settings", "get_settings"),
RouteDefinition("POST", "/api/lm/settings", "update_settings"),
RouteDefinition("GET", "/api/lm/priority-tags", "get_priority_tags"),
RouteDefinition("GET", "/api/lm/settings/libraries", "get_settings_libraries"),
RouteDefinition("POST", "/api/lm/settings/libraries/activate", "activate_library"),
RouteDefinition("GET", "/api/lm/health-check", "health_check"),

View File

@@ -55,9 +55,14 @@ COMMON_ROUTE_DEFINITIONS: tuple[RouteDefinition, ...] = (
RouteDefinition("GET", "/api/lm/{prefix}/civitai/versions/{model_id}", "get_civitai_versions"), RouteDefinition("GET", "/api/lm/{prefix}/civitai/versions/{model_id}", "get_civitai_versions"),
RouteDefinition("GET", "/api/lm/{prefix}/civitai/model/version/{modelVersionId}", "get_civitai_model_by_version"), RouteDefinition("GET", "/api/lm/{prefix}/civitai/model/version/{modelVersionId}", "get_civitai_model_by_version"),
RouteDefinition("GET", "/api/lm/{prefix}/civitai/model/hash/{hash}", "get_civitai_model_by_hash"), RouteDefinition("GET", "/api/lm/{prefix}/civitai/model/hash/{hash}", "get_civitai_model_by_hash"),
RouteDefinition("POST", "/api/lm/{prefix}/updates/refresh", "refresh_model_updates"),
RouteDefinition("POST", "/api/lm/{prefix}/updates/ignore", "set_model_update_ignore"),
RouteDefinition("GET", "/api/lm/{prefix}/updates/status/{model_id}", "get_model_update_status"),
RouteDefinition("POST", "/api/lm/download-model", "download_model"), RouteDefinition("POST", "/api/lm/download-model", "download_model"),
RouteDefinition("GET", "/api/lm/download-model-get", "download_model_get"), RouteDefinition("GET", "/api/lm/download-model-get", "download_model_get"),
RouteDefinition("GET", "/api/lm/cancel-download-get", "cancel_download_get"), RouteDefinition("GET", "/api/lm/cancel-download-get", "cancel_download_get"),
RouteDefinition("GET", "/api/lm/pause-download", "pause_download_get"),
RouteDefinition("GET", "/api/lm/resume-download", "resume_download_get"),
RouteDefinition("GET", "/api/lm/download-progress/{download_id}", "get_download_progress"), RouteDefinition("GET", "/api/lm/download-progress/{download_id}", "get_download_progress"),
RouteDefinition("GET", "/{prefix}", "handle_models_page"), RouteDefinition("GET", "/{prefix}", "handle_models_page"),
) )
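The shared route table keeps `{prefix}` as a placeholder so the same definitions serve loras, checkpoints, and embeddings. A rough sketch of expanding them for one model family (`RouteDefinition` is reduced to a namedtuple stand-in here; the real class and registration code live in the repository):

```python
from collections import namedtuple

RouteDefinition = namedtuple("RouteDefinition", ["method", "path", "handler_name"])

DEFINITIONS = (
    RouteDefinition("POST", "/api/lm/{prefix}/updates/refresh", "refresh_model_updates"),
    RouteDefinition("GET", "/api/lm/{prefix}/updates/status/{model_id}", "get_model_update_status"),
)

def expand_routes(prefix: str):
    # str.replace only touches {prefix}; dynamic segments such as
    # {model_id} stay intact for aiohttp's path matching.
    routes = []
    for d in DEFINITIONS:
        routes.append((d.method, d.path.replace("{prefix}", prefix), d.handler_name))
    return routes
```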

View File

@@ -1,5 +1,6 @@
from abc import ABC, abstractmethod
-from typing import Dict, List, Optional, Type
+import asyncio
+from typing import Dict, List, Optional, Type, TYPE_CHECKING
import logging
import os
@@ -10,6 +11,9 @@ from .settings_manager import get_settings_manager
logger = logging.getLogger(__name__)
if TYPE_CHECKING:
from .model_update_service import ModelUpdateService
class BaseModelService(ABC):
"""Base service class for all model types"""
@@ -23,6 +27,7 @@ class BaseModelService(ABC):
filter_set: Optional[ModelFilterSet] = None,
search_strategy: Optional[SearchStrategy] = None,
settings_provider: Optional[SettingsProvider] = None,
update_service: Optional["ModelUpdateService"] = None,
):
"""Initialize the service.
@@ -34,6 +39,7 @@ class BaseModelService(ABC):
filter_set: Filter component controlling folder/tag/favorites logic.
search_strategy: Search component for fuzzy/text matching.
settings_provider: Settings object; defaults to the global settings manager.
update_service: Service used to determine whether models have remote updates available.
"""
self.model_type = model_type
self.scanner = scanner
@@ -42,6 +48,7 @@ class BaseModelService(ABC):
self.cache_repository = cache_repository or ModelCacheRepository(scanner)
self.filter_set = filter_set or ModelFilterSet(self.settings)
self.search_strategy = search_strategy or SearchStrategy()
self.update_service = update_service
async def get_paginated_data(
self,
@@ -56,6 +63,7 @@ class BaseModelService(ABC):
search_options: dict = None,
hash_filters: dict = None,
favorites_only: bool = False,
has_update: bool = False,
**kwargs,
) -> Dict:
"""Get paginated and filtered model data"""
@@ -85,6 +93,9 @@ class BaseModelService(ABC):
filtered_data = await self._apply_specific_filters(filtered_data, **kwargs)
if has_update:
filtered_data = await self._apply_update_filter(filtered_data)
return self._paginate(filtered_data, page, page_size)
@@ -145,6 +156,59 @@ class BaseModelService(ABC):
"""Apply model-specific filters - to be overridden by subclasses if needed"""
return data
async def _apply_update_filter(self, data: List[Dict]) -> List[Dict]:
"""Filter models to those with remote updates available when requested."""
if not data:
return []
if self.update_service is None:
logger.warning(
"Requested has_update filter for %s models but update service is unavailable",
self.model_type,
)
return []
candidates: List[tuple[Dict, int]] = []
for item in data:
model_id = self._extract_model_id(item)
if model_id is not None:
candidates.append((item, model_id))
if not candidates:
return []
tasks = [
self.update_service.has_update(self.model_type, model_id)
for _, model_id in candidates
]
results = await asyncio.gather(*tasks, return_exceptions=True)
filtered: List[Dict] = []
for (item, model_id), result in zip(candidates, results):
if isinstance(result, Exception):
logger.error(
"Failed to resolve update status for model %s (%s): %s",
model_id,
self.model_type,
result,
)
continue
if result:
filtered.append(item)
return filtered
@staticmethod
def _extract_model_id(item: Dict) -> Optional[int]:
civitai = item.get('civitai') if isinstance(item, dict) else None
if not isinstance(civitai, dict):
return None
try:
value = civitai.get('modelId')
if value is None:
return None
return int(value)
except (TypeError, ValueError):
return None
def _paginate(self, data: List[Dict], page: int, page_size: int) -> Dict:
"""Apply pagination to filtered data"""
total_items = len(data)
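`_apply_update_filter` above fans out one `has_update` check per model with `asyncio.gather(return_exceptions=True)`, so a single failed lookup is logged and skipped rather than aborting the whole page. A runnable sketch of that concurrent-filter shape (the stub checker below stands in for `ModelUpdateService.has_update` and is purely illustrative):

```python
import asyncio
from typing import Dict, List

async def filter_models_with_updates(models: List[Dict], has_update) -> List[Dict]:
    # Pair each model with its civitai model id, skipping untracked entries.
    candidates = [(m, m["civitai"]["modelId"]) for m in models if m.get("civitai")]
    # One concurrent check per candidate; exceptions come back as results.
    results = await asyncio.gather(
        *(has_update(mid) for _, mid in candidates),
        return_exceptions=True,
    )
    kept = []
    for (model, mid), result in zip(candidates, results):
        if isinstance(result, Exception):
            continue  # failed lookup: drop this model, keep going
        if result:
            kept.append(model)
    return kept

async def fake_has_update(model_id: int) -> bool:
    # Stub: odd ids have updates, id 3 simulates a failed lookup.
    if model_id == 3:
        raise RuntimeError("lookup failed")
    return model_id % 2 == 1

models = [{"civitai": {"modelId": i}} for i in (1, 2, 3, 4, 5)]
kept = asyncio.run(filter_models_with_updates(models, fake_has_update))
```

Using `return_exceptions=True` is the key design choice: it trades fail-fast semantics for partial results, which suits a listing filter where one flaky remote record should not blank the page.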

View File

@@ -11,13 +11,14 @@ logger = logging.getLogger(__name__)
class CheckpointService(BaseModelService):
"""Checkpoint-specific service implementation"""
-def __init__(self, scanner):
+def __init__(self, scanner, update_service=None):
"""Initialize Checkpoint service
Args:
scanner: Checkpoint scanner instance
update_service: Optional service for remote update tracking.
"""
-super().__init__("checkpoint", scanner, CheckpointMetadata)
+super().__init__("checkpoint", scanner, CheckpointMetadata, update_service=update_service)
async def format_response(self, checkpoint_data: Dict) -> Dict:
"""Format Checkpoint data for API response"""

View File

@@ -0,0 +1,554 @@
import os
import json
import logging
import asyncio
from copy import deepcopy
from typing import Optional, Dict, Tuple, List
from .model_metadata_provider import CivArchiveModelMetadataProvider, ModelMetadataProviderManager
from .downloader import get_downloader
from .errors import RateLimitError
try:
from bs4 import BeautifulSoup
except ImportError as exc:
BeautifulSoup = None # type: ignore[assignment]
_BS4_IMPORT_ERROR = exc
else:
_BS4_IMPORT_ERROR = None
def _require_beautifulsoup():
if BeautifulSoup is None:
raise RuntimeError(
"BeautifulSoup (bs4) is required for CivArchive client. "
"Install it with 'pip install beautifulsoup4'."
) from _BS4_IMPORT_ERROR
return BeautifulSoup
logger = logging.getLogger(__name__)
class CivArchiveClient:
_instance = None
_lock = asyncio.Lock()
@classmethod
async def get_instance(cls):
"""Get singleton instance of CivArchiveClient"""
async with cls._lock:
if cls._instance is None:
cls._instance = cls()
# Register this client as a metadata provider
provider_manager = await ModelMetadataProviderManager.get_instance()
provider_manager.register_provider('civarchive', CivArchiveModelMetadataProvider(cls._instance), False)
return cls._instance
def __init__(self):
# Check if already initialized for singleton pattern
if hasattr(self, '_initialized'):
return
self._initialized = True
self.base_url = "https://civarchive.com/api"
async def _request_json(
self,
path: str,
params: Optional[Dict[str, str]] = None
) -> Tuple[Optional[Dict], Optional[str]]:
"""Call CivArchive API and return JSON payload"""
success, payload = await self._make_request(path, params=params)
if not success:
error = payload if isinstance(payload, str) else "Request failed"
return None, error
if not isinstance(payload, dict):
return None, "Invalid response structure"
return payload, None
async def _make_request(
self,
path: str,
*,
params: Optional[Dict[str, str]] = None,
) -> Tuple[bool, Dict | str]:
"""Wrapper around downloader.make_request that surfaces rate limits."""
downloader = await get_downloader()
kwargs: Dict[str, Dict[str, str]] = {}
if params:
safe_params = {str(key): str(value) for key, value in params.items() if value is not None}
if safe_params:
kwargs["params"] = safe_params
success, payload = await downloader.make_request(
"GET",
f"{self.base_url}{path}",
use_auth=False,
**kwargs,
)
if not success and isinstance(payload, RateLimitError):
if payload.provider is None:
payload.provider = "civarchive_api"
raise payload
return success, payload
@staticmethod
def _normalize_payload(payload: Dict) -> Dict:
"""Unwrap CivArchive responses that wrap content under a data key"""
if not isinstance(payload, dict):
return {}
data = payload.get("data")
if isinstance(data, dict):
return data
return payload
@staticmethod
def _split_context(payload: Dict) -> Tuple[Dict, Dict, List[Dict]]:
"""Separate version payload from surrounding model context"""
data = CivArchiveClient._normalize_payload(payload)
context: Dict = {}
fallback_files: List[Dict] = []
version: Dict = {}
for key, value in data.items():
if key in {"version", "model"}:
continue
context[key] = value
if isinstance(data.get("version"), dict):
version = data["version"]
model_block = data.get("model")
if isinstance(model_block, dict):
for key, value in model_block.items():
if key == "version":
if not version and isinstance(value, dict):
version = value
continue
context.setdefault(key, value)
fallback_files = fallback_files or model_block.get("files") or []
fallback_files = fallback_files or data.get("files") or []
return context, version, fallback_files
@staticmethod
def _ensure_list(value) -> List:
if isinstance(value, list):
return value
if value is None:
return []
return [value]
@staticmethod
def _build_model_info(context: Dict) -> Dict:
tags = context.get("tags")
if not isinstance(tags, list):
tags = list(tags) if isinstance(tags, (set, tuple)) else ([] if tags is None else [tags])
return {
"name": context.get("name"),
"type": context.get("type"),
"nsfw": bool(context.get("is_nsfw", context.get("nsfw", False))),
"description": context.get("description"),
"tags": tags,
}
@staticmethod
def _build_creator_info(context: Dict) -> Dict:
username = context.get("creator_username") or context.get("username") or ""
image = context.get("creator_image") or context.get("creator_avatar") or ""
creator: Dict[str, Optional[str]] = {
"username": username,
"image": image,
}
if context.get("creator_name"):
creator["name"] = context["creator_name"]
if context.get("creator_url"):
creator["url"] = context["creator_url"]
return creator
@staticmethod
def _transform_file_entry(file_data: Dict) -> Dict:
mirrors = file_data.get("mirrors") or []
if not isinstance(mirrors, list):
mirrors = [mirrors]
available_mirror = next(
(mirror for mirror in mirrors if isinstance(mirror, dict) and mirror.get("deletedAt") is None),
None
)
download_url = file_data.get("downloadUrl")
if not download_url and available_mirror:
download_url = available_mirror.get("url")
name = file_data.get("name")
if not name and available_mirror:
name = available_mirror.get("filename")
transformed: Dict = {
"id": file_data.get("id"),
"sizeKB": file_data.get("sizeKB"),
"name": name,
"type": file_data.get("type"),
"downloadUrl": download_url,
"primary": True,
# TODO: CivArchive reports is_primary as false even for primary files; force True until resolved.
# "primary": bool(file_data.get("is_primary", file_data.get("primary", False))),
"mirrors": mirrors,
}
sha256 = file_data.get("sha256")
if sha256:
transformed["hashes"] = {"SHA256": str(sha256).upper()}
elif isinstance(file_data.get("hashes"), dict):
transformed["hashes"] = file_data["hashes"]
if "metadata" in file_data:
transformed["metadata"] = file_data["metadata"]
if file_data.get("modelVersionId") is not None:
transformed["modelVersionId"] = file_data.get("modelVersionId")
elif file_data.get("model_version_id") is not None:
transformed["modelVersionId"] = file_data.get("model_version_id")
if file_data.get("modelId") is not None:
transformed["modelId"] = file_data.get("modelId")
elif file_data.get("model_id") is not None:
transformed["modelId"] = file_data.get("model_id")
return transformed
def _transform_files(
self,
files: Optional[List[Dict]],
fallback_files: Optional[List[Dict]] = None
) -> List[Dict]:
candidates: List[Dict] = []
if isinstance(files, list) and files:
candidates = files
elif isinstance(fallback_files, list):
candidates = fallback_files
transformed_files: List[Dict] = []
for file_data in candidates:
if isinstance(file_data, dict):
transformed_files.append(self._transform_file_entry(file_data))
return transformed_files
def _transform_version(
self,
context: Dict,
version: Dict,
fallback_files: Optional[List[Dict]] = None
) -> Optional[Dict]:
if not version:
return None
version_copy = deepcopy(version)
version_copy.pop("model", None)
version_copy.pop("creator", None)
if "trigger" in version_copy:
triggers = version_copy.pop("trigger")
if isinstance(triggers, list):
version_copy["trainedWords"] = triggers
elif triggers is None:
version_copy["trainedWords"] = []
else:
version_copy["trainedWords"] = [triggers]
if "trainedWords" in version_copy and isinstance(version_copy["trainedWords"], str):
version_copy["trainedWords"] = [version_copy["trainedWords"]]
if "nsfw_level" in version_copy:
version_copy["nsfwLevel"] = version_copy.pop("nsfw_level")
elif "nsfwLevel" not in version_copy and context.get("nsfw_level") is not None:
version_copy["nsfwLevel"] = context.get("nsfw_level")
stats_keys = ["downloadCount", "ratingCount", "rating"]
stats = {key: version_copy.pop(key) for key in stats_keys if key in version_copy}
if stats:
version_copy["stats"] = stats
version_copy["files"] = self._transform_files(version_copy.get("files"), fallback_files)
version_copy["images"] = self._ensure_list(version_copy.get("images"))
version_copy["model"] = self._build_model_info(context)
version_copy["creator"] = self._build_creator_info(context)
version_copy["source"] = "civarchive"
version_copy["is_deleted"] = bool(context.get("deletedAt")) or bool(version.get("deletedAt"))
return version_copy
async def _resolve_version_from_files(self, payload: Dict) -> Optional[Dict]:
"""Fallback to fetch version data when only file metadata is available"""
data = self._normalize_payload(payload)
files = data.get("files") or payload.get("files") or []
if not isinstance(files, list):
files = [files]
for file_data in files:
if not isinstance(file_data, dict):
continue
model_id = file_data.get("model_id") or file_data.get("modelId")
version_id = file_data.get("model_version_id") or file_data.get("modelVersionId")
if model_id is None or version_id is None:
continue
resolved = await self.get_model_version(model_id, version_id)
if resolved:
return resolved
return None
async def get_model_by_hash(self, model_hash: str) -> Tuple[Optional[Dict], Optional[str]]:
"""Find model by SHA256 hash value using CivArchive API"""
try:
payload, error = await self._request_json(f"/sha256/{model_hash.lower()}")
if error:
if "not found" in error.lower():
return None, "Model not found"
return None, error
context, version_data, fallback_files = self._split_context(payload)
transformed = self._transform_version(context, version_data, fallback_files)
if transformed:
return transformed, None
resolved = await self._resolve_version_from_files(payload)
if resolved:
return resolved, None
logger.error("No version data found for CivArchive model hash %s", model_hash[:10])
return None, "No version data found"
except RateLimitError:
raise
except Exception as e:
logger.error(f"Error fetching CivArchive model by hash {model_hash[:10]}: {e}")
return None, str(e)
async def get_model_versions(self, model_id: str) -> Optional[Dict]:
"""Get all versions of a model using CivArchive API"""
try:
payload, error = await self._request_json(f"/models/{model_id}")
if error or payload is None:
if error and "not found" in error.lower():
return None
logger.error(f"Error fetching CivArchive model versions for {model_id}: {error}")
return None
data = self._normalize_payload(payload)
context, version_data, fallback_files = self._split_context(payload)
versions_meta = data.get("versions") or []
transformed_versions: List[Dict] = []
for meta in versions_meta:
if not isinstance(meta, dict):
continue
version_id = meta.get("id")
if version_id is None:
continue
target_model_id = meta.get("modelId") or model_id
version = await self.get_model_version(target_model_id, version_id)
if version:
transformed_versions.append(version)
# Always include the version returned with the model payload first; duplicates are removed below
primary_version = self._transform_version(context, version_data, fallback_files)
if primary_version:
transformed_versions.insert(0, primary_version)
ordered_versions: List[Dict] = []
seen_ids = set()
for version in transformed_versions:
version_id = version.get("id")
if version_id in seen_ids:
continue
seen_ids.add(version_id)
ordered_versions.append(version)
return {
"modelVersions": ordered_versions,
"type": context.get("type", ""),
"name": context.get("name", ""),
}
except RateLimitError:
raise
except Exception as e:
logger.error(f"Error fetching CivArchive model versions for {model_id}: {e}")
return None
async def get_model_version(self, model_id: int = None, version_id: int = None) -> Optional[Dict]:
"""Get specific model version using CivArchive API
Args:
model_id: The model ID (required)
version_id: Optional specific version ID; when omitted, the model's default version is returned
Returns:
Optional[Dict]: The model version data or None if not found
"""
if model_id is None:
return None
try:
params = {"modelVersionId": version_id} if version_id is not None else None
payload, error = await self._request_json(f"/models/{model_id}", params=params)
if error or payload is None:
if error and "not found" in error.lower():
return None
logger.error(f"Error fetching CivArchive model version via API {model_id}/{version_id}: {error}")
return None
context, version_data, fallback_files = self._split_context(payload)
if not version_data:
return await self._resolve_version_from_files(payload)
if version_id is not None:
raw_id = version_data.get("id")
if raw_id != version_id:
logger.warning(
"Requested version %s doesn't match default version %s for model %s",
version_id,
raw_id,
model_id,
)
return None
actual_model_id = version_data.get("modelId")
context_model_id = context.get("id")
# CivArchive can respond with data for a different model id while already
# returning the fully resolved model context. Only follow the redirect when
# the context itself still points to the original (wrong) model.
if (
actual_model_id is not None
and str(actual_model_id) != str(model_id)
and (context_model_id is None or str(context_model_id) != str(actual_model_id))
):
return await self.get_model_version(actual_model_id, version_id)
return self._transform_version(context, version_data, fallback_files)
except RateLimitError:
raise
except Exception as e:
logger.error(f"Error fetching CivArchive model version via API {model_id}/{version_id}: {e}")
return None
async def get_model_version_info(self, version_id: str) -> Tuple[Optional[Dict], Optional[str]]:
"""Fetch model version metadata via a placeholder model lookup.
CivArchive lacks a direct version-lookup endpoint, so this calls get_model_version() with a placeholder model ID and relies on its redirect handling to resolve the correct model.
Args:
version_id: The model version ID
Returns:
Tuple[Optional[Dict], Optional[str]]: (version_data, error_message)
"""
version = await self.get_model_version(1, version_id)
if version is None:
return None, "Model not found"
return version, None
async def get_model_by_url(self, url) -> Optional[Dict]:
"""Get specific model version by parsing CivArchive HTML page (legacy method)
This is the original HTML-scraping implementation, kept for reference and for entries not yet exposed through the API.
The primary get_model_version() now uses the API instead.
"""
try:
# Construct CivArchive URL
url = f"https://civarchive.com/{url}"
downloader = await get_downloader()
session = await downloader.session
async with session.get(url) as response:
if response.status != 200:
return None
html_content = await response.text()
# Parse HTML to extract JSON data
soup_parser = _require_beautifulsoup()
soup = soup_parser(html_content, 'html.parser')
script_tag = soup.find('script', {'id': '__NEXT_DATA__', 'type': 'application/json'})
if not script_tag:
return None
# Parse JSON content
json_data = json.loads(script_tag.string)
model_data = json_data.get('props', {}).get('pageProps', {}).get('model')
if not model_data or 'version' not in model_data:
return None
# Extract version data as base
version = model_data['version'].copy()
# Restructure stats
if 'downloadCount' in version and 'ratingCount' in version and 'rating' in version:
version['stats'] = {
'downloadCount': version.pop('downloadCount'),
'ratingCount': version.pop('ratingCount'),
'rating': version.pop('rating')
}
# Rename trigger to trainedWords
if 'trigger' in version:
version['trainedWords'] = version.pop('trigger')
# Transform files data to expected format
if 'files' in version:
transformed_files = []
for file_data in version['files']:
# Find first available mirror (deletedAt is null)
available_mirror = None
for mirror in file_data.get('mirrors', []):
if mirror.get('deletedAt') is None:
available_mirror = mirror
break
# Create transformed file entry
transformed_file = {
'id': file_data.get('id'),
'sizeKB': file_data.get('sizeKB'),
'name': available_mirror.get('filename', file_data.get('name')) if available_mirror else file_data.get('name'),
'type': file_data.get('type'),
'downloadUrl': available_mirror.get('url') if available_mirror else None,
'primary': file_data.get('is_primary', False),
'mirrors': file_data.get('mirrors', [])
}
# Transform hash format
if 'sha256' in file_data:
transformed_file['hashes'] = {
'SHA256': file_data['sha256'].upper()
}
transformed_files.append(transformed_file)
version['files'] = transformed_files
# Add model information
version['model'] = {
'name': model_data.get('name'),
'type': model_data.get('type'),
'nsfw': model_data.get('is_nsfw', False),
'description': model_data.get('description'),
'tags': model_data.get('tags', [])
}
version['creator'] = {
'username': model_data.get('username'),
'image': ''
}
# Add source identifier
version['source'] = 'civarchive'
version['is_deleted'] = json_data.get('query', {}).get('is_deleted', False)
return version
except RateLimitError:
raise
except Exception as e:
logger.error(f"Error fetching CivArchive model version (scraping) {url}: {e}")
return None
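Taken together, the normalization helpers above unwrap a payload that may nest everything under a `data` key and may embed the version inside a `model` block. A standalone sketch of that unwrapping logic, using simplified function names and a hypothetical payload shape:

```python
from typing import Dict, List, Tuple

def normalize_payload(payload: Dict) -> Dict:
    """Unwrap responses that nest content under a top-level "data" key."""
    if not isinstance(payload, dict):
        return {}
    data = payload.get("data")
    return data if isinstance(data, dict) else payload

def split_context(payload: Dict) -> Tuple[Dict, Dict, List[Dict]]:
    """Separate the version block from the surrounding model context."""
    data = normalize_payload(payload)
    # Everything except "version"/"model" becomes context
    context = {k: v for k, v in data.items() if k not in {"version", "model"}}
    version = data.get("version") if isinstance(data.get("version"), dict) else {}
    fallback_files: List[Dict] = []
    model_block = data.get("model")
    if isinstance(model_block, dict):
        for key, value in model_block.items():
            if key == "version":
                # Prefer a top-level version; fall back to the nested one
                if not version and isinstance(value, dict):
                    version = value
                continue
            context.setdefault(key, value)
        fallback_files = model_block.get("files") or []
    return context, version, fallback_files or data.get("files") or []

payload = {"data": {"model": {"name": "demo", "version": {"id": 7}}, "nsfw_level": 2}}
context, version, files = split_context(payload)
# context carries the model fields, version is the nested block, files fall back to []
```

The `setdefault` calls mean fields already present at the top level win over the ones copied from the `model` block.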


@@ -5,6 +5,7 @@ import os
 from typing import Optional, Dict, Tuple, List
 from .model_metadata_provider import CivitaiModelMetadataProvider, ModelMetadataProviderManager
 from .downloader import get_downloader
+from .errors import RateLimitError

 logger = logging.getLogger(__name__)
@@ -33,6 +34,29 @@ class CivitaiClient:
         self.base_url = "https://civitai.com/api/v1"

+    async def _make_request(
+        self,
+        method: str,
+        url: str,
+        *,
+        use_auth: bool = False,
+        **kwargs,
+    ) -> Tuple[bool, Dict | str]:
+        """Wrapper around downloader.make_request that surfaces rate limits."""
+        downloader = await get_downloader()
+        success, result = await downloader.make_request(
+            method,
+            url,
+            use_auth=use_auth,
+            **kwargs,
+        )
+        if not success and isinstance(result, RateLimitError):
+            if result.provider is None:
+                result.provider = "civitai_api"
+            raise result
+        return success, result
+
     @staticmethod
     def _remove_comfy_metadata(model_version: Optional[Dict]) -> None:
         """Remove Comfy-specific metadata from model version images."""
@@ -79,8 +103,7 @@
     async def get_model_by_hash(self, model_hash: str) -> Tuple[Optional[Dict], Optional[str]]:
         try:
-            downloader = await get_downloader()
-            success, result = await downloader.make_request(
+            success, result = await self._make_request(
                 'GET',
                 f"{self.base_url}/model-versions/by-hash/{model_hash}",
                 use_auth=True
@@ -90,7 +113,7 @@
             model_id = result.get('modelId')
             if model_id:
                 # Fetch additional model metadata
-                success_model, data = await downloader.make_request(
+                success_model, data = await self._make_request(
                     'GET',
                     f"{self.base_url}/models/{model_id}",
                     use_auth=True
@@ -113,6 +136,8 @@
             # Other error cases
             logger.error(f"Failed to fetch model info for {model_hash[:10]}: {result}")
             return None, str(result)
+        except RateLimitError:
+            raise
         except Exception as e:
             logger.error(f"API Error: {str(e)}")
             return None, str(e)
@@ -138,8 +163,7 @@
     async def get_model_versions(self, model_id: str) -> List[Dict]:
         """Get all versions of a model with local availability info"""
         try:
-            downloader = await get_downloader()
-            success, result = await downloader.make_request(
+            success, result = await self._make_request(
                 'GET',
                 f"{self.base_url}/models/{model_id}",
                 use_auth=True
@@ -152,6 +176,8 @@
                 'name': result.get('name', '')
             }
             return None
+        except RateLimitError:
+            raise
         except Exception as e:
             logger.error(f"Error fetching model versions: {e}")
             return None
@@ -159,23 +185,23 @@
     async def get_model_version(self, model_id: int = None, version_id: int = None) -> Optional[Dict]:
         """Get specific model version with additional metadata."""
         try:
-            downloader = await get_downloader()
             if model_id is None and version_id is not None:
-                return await self._get_version_by_id_only(downloader, version_id)
+                return await self._get_version_by_id_only(version_id)
             if model_id is not None:
-                return await self._get_version_with_model_id(downloader, model_id, version_id)
+                return await self._get_version_with_model_id(model_id, version_id)
             logger.error("Either model_id or version_id must be provided")
             return None
+        except RateLimitError:
+            raise
         except Exception as e:
             logger.error(f"Error fetching model version: {e}")
             return None

-    async def _get_version_by_id_only(self, downloader, version_id: int) -> Optional[Dict]:
-        version = await self._fetch_version_by_id(downloader, version_id)
+    async def _get_version_by_id_only(self, version_id: int) -> Optional[Dict]:
+        version = await self._fetch_version_by_id(version_id)
         if version is None:
             return None
@@ -184,15 +210,15 @@
             logger.error(f"No modelId found in version {version_id}")
             return None

-        model_data = await self._fetch_model_data(downloader, model_id)
+        model_data = await self._fetch_model_data(model_id)
         if model_data:
             self._enrich_version_with_model_data(version, model_data)
         self._remove_comfy_metadata(version)
         return version

-    async def _get_version_with_model_id(self, downloader, model_id: int, version_id: Optional[int]) -> Optional[Dict]:
-        model_data = await self._fetch_model_data(downloader, model_id)
+    async def _get_version_with_model_id(self, model_id: int, version_id: Optional[int]) -> Optional[Dict]:
+        model_data = await self._fetch_model_data(model_id)
         if not model_data:
             return None
@@ -201,12 +227,12 @@
             return None
         target_version_id = target_version.get('id')
-        version = await self._fetch_version_by_id(downloader, target_version_id) if target_version_id else None
+        version = await self._fetch_version_by_id(target_version_id) if target_version_id else None
         if version is None:
             model_hash = self._extract_primary_model_hash(target_version)
             if model_hash:
-                version = await self._fetch_version_by_hash(downloader, model_hash)
+                version = await self._fetch_version_by_hash(model_hash)
             else:
                 logger.warning(
                     f"No primary model hash found for model {model_id} version {target_version_id}"
@@ -219,8 +245,8 @@
         self._remove_comfy_metadata(version)
         return version

-    async def _fetch_model_data(self, downloader, model_id: int) -> Optional[Dict]:
-        success, data = await downloader.make_request(
+    async def _fetch_model_data(self, model_id: int) -> Optional[Dict]:
+        success, data = await self._make_request(
             'GET',
             f"{self.base_url}/models/{model_id}",
             use_auth=True
@@ -230,11 +256,11 @@
             logger.warning(f"Failed to fetch model data for model {model_id}")
         return None

-    async def _fetch_version_by_id(self, downloader, version_id: Optional[int]) -> Optional[Dict]:
+    async def _fetch_version_by_id(self, version_id: Optional[int]) -> Optional[Dict]:
         if version_id is None:
             return None
-        success, version = await downloader.make_request(
+        success, version = await self._make_request(
             'GET',
             f"{self.base_url}/model-versions/{version_id}",
             use_auth=True
@@ -245,11 +271,11 @@
             logger.warning(f"Failed to fetch version by id {version_id}")
         return None

-    async def _fetch_version_by_hash(self, downloader, model_hash: Optional[str]) -> Optional[Dict]:
+    async def _fetch_version_by_hash(self, model_hash: Optional[str]) -> Optional[Dict]:
         if not model_hash:
             return None
-        success, version = await downloader.make_request(
+        success, version = await self._make_request(
             'GET',
             f"{self.base_url}/model-versions/by-hash/{model_hash}",
             use_auth=True
@@ -323,11 +349,10 @@
             - An error message if there was an error, or None on success
         """
         try:
-            downloader = await get_downloader()
             url = f"{self.base_url}/model-versions/{version_id}"
             logger.debug(f"Resolving DNS for model version info: {url}")
-            success, result = await downloader.make_request(
+            success, result = await self._make_request(
                 'GET',
                 url,
                 use_auth=True
@@ -347,6 +372,8 @@
             # Other error cases
             logger.error(f"Failed to fetch model info for {version_id}: {result}")
             return None, str(result)
+        except RateLimitError:
+            raise
         except Exception as e:
             error_msg = f"Error fetching model version info: {e}"
             logger.error(error_msg)
@@ -362,11 +389,10 @@
             Optional[Dict]: The image data or None if not found
         """
         try:
-            downloader = await get_downloader()
             url = f"{self.base_url}/images?imageId={image_id}&nsfw=X"
             logger.debug(f"Fetching image info for ID: {image_id}")
-            success, result = await downloader.make_request(
+            success, result = await self._make_request(
                 'GET',
                 url,
                 use_auth=True
@@ -381,6 +407,8 @@
             logger.error(f"Failed to fetch image info for ID: {image_id}: {result}")
             return None
+        except RateLimitError:
+            raise
         except Exception as e:
             error_msg = f"Error fetching image info: {e}"
             logger.error(error_msg)
@@ -392,9 +420,8 @@
             return None
         try:
-            downloader = await get_downloader()
             url = f"{self.base_url}/models?username={username}"
-            success, result = await downloader.make_request(
+            success, result = await self._make_request(
                 'GET',
                 url,
                 use_auth=True
@@ -416,6 +443,8 @@
             self._remove_comfy_metadata(version)
             return items
+        except RateLimitError:
+            raise
         except Exception as exc:  # pragma: no cover - defensive logging
             logger.error("Error fetching models for %s: %s", username, exc)
             return None
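The diff above routes every Civitai call through a single `_make_request` wrapper so that rate-limit failures raise instead of being treated (and cached) as ordinary errors. A minimal self-contained sketch of that pattern, with a stubbed transport standing in for `downloader.make_request` and an assumed `RateLimitError` shape:

```python
import asyncio
from typing import Dict, Optional, Tuple, Union

class RateLimitError(Exception):
    """Stand-in for the project's RateLimitError (shape assumed from the diff)."""
    def __init__(self, message: str, provider: Optional[str] = None):
        super().__init__(message)
        self.provider = provider

async def fake_make_request(method: str, url: str) -> Tuple[bool, Union[Dict, RateLimitError]]:
    # Stub transport: a real downloader would perform the HTTP call here.
    if "limited" in url:
        return False, RateLimitError("429 Too Many Requests")
    return True, {"url": url}

async def make_request(method: str, url: str, provider: str = "civitai_api"):
    """Surface rate limits as exceptions; pass other results through unchanged."""
    success, result = await fake_make_request(method, url)
    if not success and isinstance(result, RateLimitError):
        if result.provider is None:
            result.provider = provider  # tag which upstream API throttled us
        raise result
    return success, result

ok, payload = asyncio.run(make_request("GET", "https://example.invalid/models/1"))
try:
    asyncio.run(make_request("GET", "https://example.invalid/limited"))
    caught_provider = None
except RateLimitError as exc:
    caught_provider = exc.provider  # tagged by the wrapper
```

Because the wrapper raises, every caller only needs a narrow `except RateLimitError: raise` ahead of its broad `except Exception` handler, which is exactly what the hunks above add.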


@@ -5,6 +5,8 @@ from __future__ import annotations
 import logging
 from typing import Any, Awaitable, Callable, Dict, Optional

+from .downloader import DownloadProgress
+
 logger = logging.getLogger(__name__)
@@ -29,14 +31,40 @@ class DownloadCoordinator:
         download_id = payload.get("download_id") or self._ws_manager.generate_download_id()
         payload.setdefault("download_id", download_id)

-        async def progress_callback(progress: Any) -> None:
+        async def progress_callback(progress: Any, snapshot: Optional[DownloadProgress] = None) -> None:
+            percent = 0.0
+            metrics: Optional[DownloadProgress] = None
+            if isinstance(progress, DownloadProgress):
+                metrics = progress
+                percent = progress.percent_complete
+            elif isinstance(snapshot, DownloadProgress):
+                metrics = snapshot
+                percent = snapshot.percent_complete
+            else:
+                try:
+                    percent = float(progress)
+                except (TypeError, ValueError):
+                    percent = 0.0
+            payload: Dict[str, Any] = {
+                "status": "progress",
+                "progress": round(percent),
+                "download_id": download_id,
+            }
+            if metrics is not None:
+                payload.update(
+                    {
+                        "bytes_downloaded": metrics.bytes_downloaded,
+                        "total_bytes": metrics.total_bytes,
+                        "bytes_per_second": metrics.bytes_per_second,
+                    }
+                )
             await self._ws_manager.broadcast_download_progress(
                 download_id,
-                {
-                    "status": "progress",
-                    "progress": progress,
-                    "download_id": download_id,
-                },
+                payload,
             )

         model_id = self._parse_optional_int(payload.get("model_id"), "model_id")
@@ -81,6 +109,56 @@
         return result

+    async def pause_download(self, download_id: str) -> Dict[str, Any]:
+        """Pause an active download and notify listeners."""
+        download_manager = await self._download_manager_factory()
+        result = await download_manager.pause_download(download_id)
+        if result.get("success"):
+            cached_progress = self._ws_manager.get_download_progress(download_id) or {}
+            payload: Dict[str, Any] = {
+                "status": "paused",
+                "progress": cached_progress.get("progress", 0),
+                "download_id": download_id,
+                "message": "Download paused by user",
+            }
+            for field in ("bytes_downloaded", "total_bytes", "bytes_per_second"):
+                if field in cached_progress:
+                    payload[field] = cached_progress[field]
+            payload["bytes_per_second"] = 0.0
+            await self._ws_manager.broadcast_download_progress(download_id, payload)
+        return result
+
+    async def resume_download(self, download_id: str) -> Dict[str, Any]:
+        """Resume a paused download and notify listeners."""
+        download_manager = await self._download_manager_factory()
+        result = await download_manager.resume_download(download_id)
+        if result.get("success"):
+            cached_progress = self._ws_manager.get_download_progress(download_id) or {}
+            payload: Dict[str, Any] = {
+                "status": "downloading",
+                "progress": cached_progress.get("progress", 0),
+                "download_id": download_id,
+                "message": "Download resumed by user",
+            }
+            for field in ("bytes_downloaded", "total_bytes"):
+                if field in cached_progress:
+                    payload[field] = cached_progress[field]
+            payload["bytes_per_second"] = cached_progress.get("bytes_per_second", 0.0)
+            await self._ws_manager.broadcast_download_progress(download_id, payload)
+        return result
+
     async def list_active_downloads(self) -> Dict[str, Any]:
         """Return the active download map from the underlying manager."""
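The reworked `progress_callback` accepts either a bare percentage (the legacy path) or a rich `DownloadProgress` snapshot, and folds both into one broadcast payload. A standalone sketch of that normalization, with the dataclass fields assumed from the diff:

```python
from dataclasses import dataclass
from typing import Any, Dict, Optional

@dataclass
class DownloadProgress:
    """Assumed shape of the downloader's progress snapshot."""
    percent_complete: float
    bytes_downloaded: int
    total_bytes: Optional[int]
    bytes_per_second: float

def build_progress_payload(download_id: str, progress: Any,
                           snapshot: Optional[DownloadProgress] = None) -> Dict[str, Any]:
    metrics: Optional[DownloadProgress] = None
    if isinstance(progress, DownloadProgress):
        metrics, percent = progress, progress.percent_complete
    elif isinstance(snapshot, DownloadProgress):
        metrics, percent = snapshot, snapshot.percent_complete
    else:
        try:
            percent = float(progress)  # legacy callers pass a bare number
        except (TypeError, ValueError):
            percent = 0.0
    payload: Dict[str, Any] = {
        "status": "progress",
        "progress": round(percent),
        "download_id": download_id,
    }
    if metrics is not None:
        # Byte-level metrics are only present when a snapshot was supplied
        payload.update({
            "bytes_downloaded": metrics.bytes_downloaded,
            "total_bytes": metrics.total_bytes,
            "bytes_per_second": metrics.bytes_per_second,
        })
    return payload

legacy = build_progress_payload("dl-1", 42.6)  # plain float still works
rich = build_progress_payload("dl-1", DownloadProgress(99.9, 1024, 2048, 512.0))
```

Keeping the bare-number path means older callbacks keep working while newer ones gain byte counts and throughput in the same WebSocket message.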


@@ -1,19 +1,20 @@
import logging import logging
import os import os
import asyncio import asyncio
import inspect
from collections import OrderedDict from collections import OrderedDict
import uuid import uuid
from typing import Dict, List from typing import Dict, List, Optional, Tuple
from urllib.parse import urlparse from urllib.parse import urlparse
from ..utils.models import LoraMetadata, CheckpointMetadata, EmbeddingMetadata from ..utils.models import LoraMetadata, CheckpointMetadata, EmbeddingMetadata
from ..utils.constants import CARD_PREVIEW_WIDTH, VALID_LORA_TYPES, CIVITAI_MODEL_TAGS from ..utils.constants import CARD_PREVIEW_WIDTH, VALID_LORA_TYPES
from ..utils.civitai_utils import rewrite_preview_url from ..utils.civitai_utils import rewrite_preview_url
from ..utils.exif_utils import ExifUtils from ..utils.exif_utils import ExifUtils
from ..utils.metadata_manager import MetadataManager from ..utils.metadata_manager import MetadataManager
from .service_registry import ServiceRegistry from .service_registry import ServiceRegistry
from .settings_manager import get_settings_manager from .settings_manager import get_settings_manager
from .metadata_service import get_default_metadata_provider from .metadata_service import get_default_metadata_provider
from .downloader import get_downloader from .downloader import get_downloader, DownloadProgress
# Download to temporary file first # Download to temporary file first
import tempfile import tempfile
@@ -42,6 +43,7 @@ class DownloadManager:
self._active_downloads = OrderedDict() # download_id -> download_info self._active_downloads = OrderedDict() # download_id -> download_info
self._download_semaphore = asyncio.Semaphore(5) # Limit concurrent downloads self._download_semaphore = asyncio.Semaphore(5) # Limit concurrent downloads
self._download_tasks = {} # download_id -> asyncio.Task self._download_tasks = {} # download_id -> asyncio.Task
self._pause_events: Dict[str, asyncio.Event] = {}
async def _get_lora_scanner(self): async def _get_lora_scanner(self):
"""Get the lora scanner from registry""" """Get the lora scanner from registry"""
@@ -82,9 +84,16 @@ class DownloadManager:
'model_id': model_id, 'model_id': model_id,
'model_version_id': model_version_id, 'model_version_id': model_version_id,
'progress': 0, 'progress': 0,
'status': 'queued' 'status': 'queued',
'bytes_downloaded': 0,
'total_bytes': None,
'bytes_per_second': 0.0,
} }
pause_event = asyncio.Event()
pause_event.set()
self._pause_events[task_id] = pause_event
# Create tracking task # Create tracking task
download_task = asyncio.create_task( download_task = asyncio.create_task(
self._download_with_semaphore( self._download_with_semaphore(
@@ -107,6 +116,7 @@ class DownloadManager:
# Clean up task reference # Clean up task reference
if task_id in self._download_tasks: if task_id in self._download_tasks:
del self._download_tasks[task_id] del self._download_tasks[task_id]
self._pause_events.pop(task_id, None)
async def _download_with_semaphore(self, task_id: str, model_id: int, model_version_id: int, async def _download_with_semaphore(self, task_id: str, model_id: int, model_version_id: int,
save_dir: str, relative_path: str, save_dir: str, relative_path: str,
@@ -119,15 +129,30 @@ class DownloadManager:
# Wrap progress callback to track progress in active_downloads
original_callback = progress_callback
-async def tracking_callback(progress):
+async def tracking_callback(progress, metrics=None):
+progress_value, snapshot = self._normalize_progress(progress, metrics)
if task_id in self._active_downloads:
-self._active_downloads[task_id]['progress'] = progress
+info = self._active_downloads[task_id]
+info['progress'] = round(progress_value)
+if snapshot is not None:
+info['bytes_downloaded'] = snapshot.bytes_downloaded
+info['total_bytes'] = snapshot.total_bytes
+info['bytes_per_second'] = snapshot.bytes_per_second
if original_callback:
-await original_callback(progress)
+await self._dispatch_progress(original_callback, snapshot, progress_value)
# Acquire semaphore to limit concurrent downloads
try:
async with self._download_semaphore:
+pause_event = self._pause_events.get(task_id)
+if pause_event is not None and not pause_event.is_set():
+if task_id in self._active_downloads:
+self._active_downloads[task_id]['status'] = 'paused'
+self._active_downloads[task_id]['bytes_per_second'] = 0.0
+await pause_event.wait()
# Update status to downloading
if task_id in self._active_downloads:
self._active_downloads[task_id]['status'] = 'downloading'
@@ -149,12 +174,14 @@ class DownloadManager:
self._active_downloads[task_id]['status'] = 'completed' if result['success'] else 'failed'
if not result['success']:
self._active_downloads[task_id]['error'] = result.get('error', 'Unknown error')
+self._active_downloads[task_id]['bytes_per_second'] = 0.0
return result
except asyncio.CancelledError:
# Handle cancellation
if task_id in self._active_downloads:
self._active_downloads[task_id]['status'] = 'cancelled'
+self._active_downloads[task_id]['bytes_per_second'] = 0.0
logger.info(f"Download cancelled for task {task_id}")
raise
except Exception as e:
@@ -163,6 +190,7 @@ class DownloadManager:
if task_id in self._active_downloads:
self._active_downloads[task_id]['status'] = 'failed'
self._active_downloads[task_id]['error'] = str(e)
+self._active_downloads[task_id]['bytes_per_second'] = 0.0
return {'success': False, 'error': str(e)}
finally:
# Schedule cleanup of download record after delay
@@ -174,9 +202,17 @@ class DownloadManager:
if task_id in self._active_downloads:
del self._active_downloads[task_id]
-async def _execute_original_download(self, model_id, model_version_id, save_dir,
-relative_path, progress_callback, use_default_paths,
-download_id=None, source=None):
+async def _execute_original_download(
+self,
+model_id,
+model_version_id,
+save_dir,
+relative_path,
+progress_callback,
+use_default_paths,
+download_id=None,
+source=None,
+):
"""Wrapper for original download_from_civitai implementation"""
try:
# Check if model version already exists in library
@@ -198,12 +234,7 @@ class DownloadManager:
if await embedding_scanner.check_model_version_exists(model_version_id):
return {'success': False, 'error': 'Model version already exists in embedding library'}
-# Get metadata provider based on source parameter
-if source == 'civarchive':
-from .metadata_service import get_metadata_provider
-metadata_provider = await get_metadata_provider('civarchive')
-else:
-metadata_provider = await get_default_metadata_provider()
+metadata_provider = await get_default_metadata_provider()
# Get version info based on the provided identifier
version_info = await metadata_provider.get_model_version(model_id, model_version_id)
@@ -294,7 +325,7 @@ class DownloadManager:
await progress_callback(0)
# 2. Get file information
-file_info = next((f for f in version_info.get('files', []) if f.get('primary')), None)
+file_info = next((f for f in version_info.get('files', []) if f.get('primary') and f.get('type') == 'Model'), None)
if not file_info:
return {'success': False, 'error': 'No primary file found in metadata'}
mirrors = file_info.get('mirrors') or []
@@ -309,7 +340,7 @@ class DownloadManager:
download_urls.append(download_url)
if not download_urls:
-return {'success': False, 'error': 'No download URL found for primary file'}
+return {'success': False, 'error': 'No mirror URL found'}
# 3. Prepare download
file_name = file_info['name']
@@ -335,7 +366,7 @@ class DownloadManager:
relative_path=relative_path,
progress_callback=progress_callback,
model_type=model_type,
-download_id=download_id
+download_id=download_id,
)
# If early_access_msg exists and download failed, replace error message
@@ -387,16 +418,7 @@ class DownloadManager:
# Get model tags
model_tags = version_info.get('model', {}).get('tags', [])
-# Find the first Civitai model tag that exists in model_tags
-first_tag = ''
-for civitai_tag in CIVITAI_MODEL_TAGS:
-if civitai_tag in model_tags:
-first_tag = civitai_tag
-break
-# If no Civitai model tag found, fallback to first tag
-if not first_tag and model_tags:
-first_tag = model_tags[0]
+first_tag = settings_manager.resolve_priority_tag_for_model(model_tags, model_type)
# Format the template with available data
formatted_path = path_template
@@ -409,10 +431,17 @@ class DownloadManager:
return formatted_path
-async def _execute_download(self, download_urls: List[str], save_dir: str,
-metadata, version_info: Dict,
-relative_path: str, progress_callback=None,
-model_type: str = "lora", download_id: str = None) -> Dict:
+async def _execute_download(
+self,
+download_urls: List[str],
+save_dir: str,
+metadata,
+version_info: Dict,
+relative_path: str,
+progress_callback=None,
+model_type: str = "lora",
+download_id: str = None,
+) -> Dict:
"""Execute the actual download process including preview images and model files"""
try:
# Extract original filename details
@@ -444,6 +473,8 @@ class DownloadManager:
part_path = save_path + '.part'
metadata_path = os.path.splitext(save_path)[0] + '.metadata.json'
+pause_event = self._pause_events.get(download_id) if download_id else None
# Store file paths in active_downloads for potential cleanup
if download_id and download_id in self._active_downloads:
self._active_downloads[download_id]['file_path'] = save_path
@@ -557,11 +588,22 @@ class DownloadManager:
last_error = None
for download_url in download_urls:
use_auth = download_url.startswith("https://civitai.com/api/download/")
+download_kwargs = {
+"progress_callback": lambda progress, snapshot=None: self._handle_download_progress(
+progress,
+progress_callback,
+snapshot,
+),
+"use_auth": use_auth,  # Only use authentication for Civitai downloads
+}
+if pause_event is not None:
+download_kwargs["pause_event"] = pause_event
success, result = await downloader.download_file(
download_url,
save_path,  # Use full path instead of separate dir and filename
-progress_callback=lambda p: self._handle_download_progress(p, progress_callback),
-use_auth=use_auth  # Only use authentication for Civitai downloads
+**download_kwargs,
)
if success:
@@ -640,17 +682,33 @@ class DownloadManager:
return {'success': False, 'error': str(e)}
-async def _handle_download_progress(self, file_progress: float, progress_callback):
-"""Convert file download progress to overall progress
-Args:
-file_progress: Progress of file download (0-100)
-progress_callback: Callback function for progress updates
-"""
-if progress_callback:
-# Scale file progress to 3-100 range (after preview download)
-overall_progress = 3 + (file_progress * 0.97)  # 97% of progress for file download
-await progress_callback(round(overall_progress))
+async def _handle_download_progress(
+self,
+progress_update,
+progress_callback,
+snapshot=None,
+):
+"""Convert file download progress to overall progress."""
+if not progress_callback:
+return
+file_progress, original_snapshot = self._normalize_progress(progress_update, snapshot)
+overall_progress = 3 + (file_progress * 0.97)
+overall_progress = max(0.0, min(overall_progress, 100.0))
+rounded_progress = round(overall_progress)
+normalized_snapshot: Optional[DownloadProgress] = None
+if original_snapshot is not None:
+normalized_snapshot = DownloadProgress(
+percent_complete=overall_progress,
+bytes_downloaded=original_snapshot.bytes_downloaded,
+total_bytes=original_snapshot.total_bytes,
+bytes_per_second=original_snapshot.bytes_per_second,
+timestamp=original_snapshot.timestamp,
+)
+await self._dispatch_progress(progress_callback, normalized_snapshot, rounded_progress)
async def cancel_download(self, download_id: str) -> Dict:
"""Cancel an active download by download_id
@@ -669,9 +727,14 @@ class DownloadManager:
task = self._download_tasks[download_id]
task.cancel()
+pause_event = self._pause_events.get(download_id)
+if pause_event is not None:
+pause_event.set()
# Update status in active downloads
if download_id in self._active_downloads:
self._active_downloads[download_id]['status'] = 'cancelling'
+self._active_downloads[download_id]['bytes_per_second'] = 0.0
# Wait briefly for the task to acknowledge cancellation
try:
@@ -734,6 +797,98 @@ class DownloadManager:
except Exception as e:
logger.error(f"Error cancelling download: {e}", exc_info=True)
return {'success': False, 'error': str(e)}
finally:
self._pause_events.pop(download_id, None)
async def pause_download(self, download_id: str) -> Dict:
"""Pause an active download without losing progress."""
if download_id not in self._download_tasks:
return {'success': False, 'error': 'Download task not found'}
pause_event = self._pause_events.get(download_id)
if pause_event is None:
pause_event = asyncio.Event()
pause_event.set()
self._pause_events[download_id] = pause_event
if not pause_event.is_set():
return {'success': False, 'error': 'Download is already paused'}
pause_event.clear()
download_info = self._active_downloads.get(download_id)
if download_info is not None:
download_info['status'] = 'paused'
download_info['bytes_per_second'] = 0.0
return {'success': True, 'message': 'Download paused successfully'}
async def resume_download(self, download_id: str) -> Dict:
"""Resume a previously paused download."""
pause_event = self._pause_events.get(download_id)
if pause_event is None:
return {'success': False, 'error': 'Download task not found'}
if pause_event.is_set():
return {'success': False, 'error': 'Download is not paused'}
pause_event.set()
download_info = self._active_downloads.get(download_id)
if download_info is not None:
if download_info.get('status') == 'paused':
download_info['status'] = 'downloading'
download_info.setdefault('bytes_per_second', 0.0)
return {'success': True, 'message': 'Download resumed successfully'}
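The pause/resume pair above works by clearing and setting an `asyncio.Event` that the streaming loop checks before consuming each chunk, so pausing suspends the transfer without discarding bytes already written. A minimal self-contained sketch of that gating pattern (the `stream_chunks` helper and its chunk list are illustrative, not part of this module):

```python
import asyncio

async def stream_chunks(chunks, pause_event: asyncio.Event, sink: list) -> None:
    for chunk in chunks:
        if not pause_event.is_set():
            await pause_event.wait()  # a cleared event suspends the loop here
        await asyncio.sleep(0)        # stand-in for awaiting network I/O
        sink.append(chunk)

async def main():
    gate = asyncio.Event()
    gate.set()  # downloads start un-paused
    sink: list = []
    task = asyncio.create_task(stream_chunks([b"a", b"b", b"c"], gate, sink))
    await asyncio.sleep(0)  # let the task make some progress
    gate.clear()            # "pause": no progress is lost, the loop just blocks
    for _ in range(5):
        await asyncio.sleep(0)
    blocked = not task.done()
    gate.set()              # "resume": the loop picks up where it stopped
    await task
    return blocked, sink

blocked, sink = asyncio.run(main())
```

Because the gate starts set and is only ever cleared by `pause_download`, a download that is never paused pays just one `is_set()` check per chunk.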
@staticmethod
def _coerce_progress_value(progress) -> float:
try:
return float(progress)
except (TypeError, ValueError):
return 0.0
@classmethod
def _normalize_progress(
cls,
progress,
snapshot: Optional[DownloadProgress] = None,
) -> Tuple[float, Optional[DownloadProgress]]:
if isinstance(progress, DownloadProgress):
return progress.percent_complete, progress
if isinstance(snapshot, DownloadProgress):
return snapshot.percent_complete, snapshot
if isinstance(progress, dict):
if 'percent_complete' in progress:
return cls._coerce_progress_value(progress['percent_complete']), snapshot
if 'progress' in progress:
return cls._coerce_progress_value(progress['progress']), snapshot
return cls._coerce_progress_value(progress), None
async def _dispatch_progress(
self,
callback,
snapshot: Optional[DownloadProgress],
progress_value: float,
) -> None:
try:
if snapshot is not None:
result = callback(snapshot, snapshot)
else:
result = callback(progress_value)
except TypeError:
result = callback(progress_value)
if inspect.isawaitable(result):
await result
elif asyncio.iscoroutine(result):
await result
async def get_active_downloads(self) -> Dict:
"""Get information about all active downloads
@@ -749,7 +904,10 @@ class DownloadManager:
'model_version_id': info.get('model_version_id'),
'progress': info.get('progress', 0),
'status': info.get('status', 'unknown'),
-'error': info.get('error', None)
+'error': info.get('error', None),
+'bytes_downloaded': info.get('bytes_downloaded', 0),
+'total_bytes': info.get('total_bytes'),
+'bytes_per_second': info.get('bytes_per_second', 0.0),
}
for task_id, info in self._active_downloads.items()
]
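The `_normalize_progress` / `_dispatch_progress` helpers in this file keep legacy float-only callbacks working alongside the new snapshot-aware ones. A standalone sketch of the same try-new-signature, fall-back-on-`TypeError` dispatch pattern (the function and callback names here are illustrative, not the module's API):

```python
import asyncio
import inspect

async def dispatch_progress(callback, snapshot, percent: float) -> None:
    # Prefer the richer snapshot-aware call; legacy callbacks that only accept
    # a float raise TypeError at argument-binding time and get the plain percent.
    try:
        if snapshot is not None:
            result = callback(snapshot, snapshot)
        else:
            result = callback(percent)
    except TypeError:
        result = callback(percent)
    if inspect.isawaitable(result):
        await result

received = []

async def legacy(percent):                  # old-style: single float argument
    received.append(("legacy", percent))

async def modern(progress, snapshot=None):  # new-style: accepts a snapshot
    received.append(("modern", progress))

asyncio.run(dispatch_progress(legacy, "snapshot", 42.0))
asyncio.run(dispatch_progress(modern, "snapshot", 42.0))
```

Calling an `async def` with the wrong arity raises `TypeError` before any coroutine body runs, which is what makes this probe safe: the legacy callback is never partially executed.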


@@ -14,13 +14,28 @@ import os
import logging
import asyncio
import aiohttp
-from datetime import datetime
-from typing import Optional, Dict, Tuple, Callable, Union
+from collections import deque
+from dataclasses import dataclass
+from datetime import datetime, timedelta
+from email.utils import parsedate_to_datetime
+from typing import Optional, Dict, Tuple, Callable, Union, Awaitable
from ..services.settings_manager import get_settings_manager
+from .errors import RateLimitError
logger = logging.getLogger(__name__)
@dataclass(frozen=True)
class DownloadProgress:
"""Snapshot of a download transfer at a moment in time."""
percent_complete: float
bytes_downloaded: int
total_bytes: Optional[int]
bytes_per_second: float
timestamp: float
class Downloader:
"""Unified downloader for all HTTP/HTTPS downloads in the application."""
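The frozen `DownloadProgress` dataclass above gives callbacks an immutable point-in-time view of a transfer. A quick sketch of how a consumer might render one for a status display (the dataclass is reproduced so the snippet is self-contained; `human_speed` is an illustrative helper, not part of the module):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class DownloadProgress:
    """Immutable snapshot of a transfer, mirroring the dataclass above."""
    percent_complete: float
    bytes_downloaded: int
    total_bytes: Optional[int]
    bytes_per_second: float
    timestamp: float

def human_speed(snapshot: DownloadProgress) -> str:
    # Scale bytes/second to the largest unit below 1024 for display.
    bps = snapshot.bytes_per_second
    for unit in ("B/s", "KB/s", "MB/s", "GB/s"):
        if bps < 1024 or unit == "GB/s":
            return f"{bps:.1f} {unit}"
        bps /= 1024

# 1.5 MiB/s transfer, halfway through a 1 MiB file
snap = DownloadProgress(50.0, 524288, 1048576, 1572864.0, 0.0)
```

`frozen=True` means a callback cannot accidentally mutate a snapshot that other consumers (the active-downloads table, the WebSocket broadcaster) also hold.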
@@ -159,10 +174,11 @@ class Downloader:
self,
url: str,
save_path: str,
-progress_callback: Optional[Callable[[float], None]] = None,
+progress_callback: Optional[Callable[..., Awaitable[None]]] = None,
use_auth: bool = False,
custom_headers: Optional[Dict[str, str]] = None,
-allow_resume: bool = True
+allow_resume: bool = True,
+pause_event: Optional[asyncio.Event] = None,
) -> Tuple[bool, str]:
"""
Download a file with resumable downloads and retry mechanism
@@ -174,6 +190,7 @@ class Downloader:
use_auth: Whether to include authentication headers (e.g., CivitAI API key)
custom_headers: Additional headers to include in request
allow_resume: Whether to support resumable downloads
+pause_event: Optional event that, when cleared, will pause streaming until set again
Returns:
Tuple[bool, str]: (success, save_path or error message)
@@ -248,7 +265,16 @@ class Downloader:
if allow_resume:
os.rename(part_path, save_path)
if progress_callback:
-await progress_callback(100)
+await self._dispatch_progress_callback(
+progress_callback,
+DownloadProgress(
+percent_complete=100.0,
+bytes_downloaded=part_size,
+total_bytes=actual_size,
+bytes_per_second=0.0,
+timestamp=datetime.now().timestamp(),
+),
+)
return True, save_path
# Remove corrupted part file and restart
os.remove(part_path)
@@ -276,6 +302,8 @@ class Downloader:
current_size = resume_offset
last_progress_report_time = datetime.now()
+progress_samples: deque[tuple[datetime, int]] = deque()
+progress_samples.append((last_progress_report_time, current_size))
# Ensure directory exists
os.makedirs(os.path.dirname(save_path), exist_ok=True)
@@ -285,6 +313,8 @@ class Downloader:
mode = 'ab' if (allow_resume and resume_offset > 0) else 'wb'
with open(part_path, mode) as f:
async for chunk in response.content.iter_chunked(self.chunk_size):
+if pause_event is not None and not pause_event.is_set():
+await pause_event.wait()
if chunk:
# Run blocking file write in executor
await loop.run_in_executor(None, f.write, chunk)
@@ -294,9 +324,30 @@ class Downloader:
now = datetime.now()
time_diff = (now - last_progress_report_time).total_seconds()
-if progress_callback and total_size and time_diff >= 1.0:
-progress = (current_size / total_size) * 100
-await progress_callback(progress)
+if progress_callback and time_diff >= 1.0:
+progress_samples.append((now, current_size))
+cutoff = now - timedelta(seconds=5)
while progress_samples and progress_samples[0][0] < cutoff:
progress_samples.popleft()
percent = (current_size / total_size) * 100 if total_size else 0.0
bytes_per_second = 0.0
if len(progress_samples) >= 2:
first_time, first_bytes = progress_samples[0]
last_time, last_bytes = progress_samples[-1]
elapsed = (last_time - first_time).total_seconds()
if elapsed > 0:
bytes_per_second = (last_bytes - first_bytes) / elapsed
progress_snapshot = DownloadProgress(
percent_complete=percent,
bytes_downloaded=current_size,
total_bytes=total_size or None,
bytes_per_second=bytes_per_second,
timestamp=now.timestamp(),
)
await self._dispatch_progress_callback(progress_callback, progress_snapshot)
last_progress_report_time = now
# Download completed successfully
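The speed estimate above keeps a 5-second sliding window of `(timestamp, bytes)` samples in a `deque` and divides the byte delta by the elapsed time across the window. The same calculation extracted into a standalone function (the window length mirrors the diff; the function itself is an illustrative extraction, not the module's API):

```python
from collections import deque
from datetime import datetime, timedelta

def window_speed(samples: deque, now: datetime, current_bytes: int) -> float:
    """Record a sample, evict samples older than 5 s, and return bytes/second
    over the surviving window (0.0 when fewer than two samples remain)."""
    samples.append((now, current_bytes))
    cutoff = now - timedelta(seconds=5)
    while samples and samples[0][0] < cutoff:
        samples.popleft()
    if len(samples) >= 2:
        first_time, first_bytes = samples[0]
        last_time, last_bytes = samples[-1]
        elapsed = (last_time - first_time).total_seconds()
        if elapsed > 0:
            return (last_bytes - first_bytes) / elapsed
    return 0.0

start = datetime(2025, 1, 1)
samples: deque = deque()
window_speed(samples, start, 0)  # first sample: no rate yet
speed = window_speed(samples, start + timedelta(seconds=2), 2_000_000)
# After an 8-second stall, both earlier samples fall out of the window:
stale_speed = window_speed(samples, start + timedelta(seconds=10), 3_000_000)
```

Averaging over a short window instead of the whole transfer keeps the reported rate responsive to throttling and pauses, at the cost of a little jitter.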
@@ -331,7 +382,15 @@ class Downloader:
# Ensure 100% progress is reported
if progress_callback:
-await progress_callback(100)
+final_snapshot = DownloadProgress(
percent_complete=100.0,
bytes_downloaded=final_size,
total_bytes=total_size or final_size,
bytes_per_second=0.0,
timestamp=datetime.now().timestamp(),
)
await self._dispatch_progress_callback(progress_callback, final_snapshot)
return True, save_path
@@ -364,6 +423,23 @@ class Downloader:
return False, f"Download failed after {self.max_retries + 1} attempts"
async def _dispatch_progress_callback(
self,
progress_callback: Callable[..., Awaitable[None]],
snapshot: DownloadProgress,
) -> None:
"""Invoke a progress callback while preserving backward compatibility."""
try:
result = progress_callback(snapshot, snapshot)
except TypeError:
result = progress_callback(snapshot.percent_complete)
if asyncio.iscoroutine(result):
await result
elif hasattr(result, "__await__"):
await result
async def download_to_memory(
self,
url: str,
@@ -513,6 +589,19 @@ class Downloader:
return False, "Access forbidden"
elif response.status == 404:
return False, "Resource not found"
elif response.status == 429:
retry_after = self._extract_retry_after(response.headers)
error_msg = "Request rate limited"
logger.warning(
"Rate limit encountered for %s %s; retry_after=%s",
method,
url,
retry_after,
)
return False, RateLimitError(
error_msg,
retry_after=retry_after,
)
else:
return False, f"Request failed with status {response.status}"
@@ -534,6 +623,38 @@ class Downloader:
await self._create_session()
logger.info("HTTP session refreshed due to settings change")
@staticmethod
def _extract_retry_after(headers) -> Optional[float]:
"""Parse the Retry-After header into seconds."""
if not headers:
return None
header_value = headers.get("Retry-After")
if not header_value:
return None
header_value = header_value.strip()
if not header_value:
return None
if header_value.isdigit():
try:
seconds = float(header_value)
except ValueError:
return None
return max(0.0, seconds)
try:
retry_datetime = parsedate_to_datetime(header_value)
except (TypeError, ValueError):
return None
if retry_datetime.tzinfo is None:
return None
delta = retry_datetime - datetime.now(tz=retry_datetime.tzinfo)
return max(0.0, delta.total_seconds())
# Global instance accessor
async def get_downloader() -> Downloader:
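`_extract_retry_after` above accepts both forms RFC 7231 allows for `Retry-After`: delta-seconds (`"120"`) and an HTTP-date. A self-contained sketch of that parsing as a free function (mirroring the logic rather than calling the class; the function name is illustrative):

```python
from datetime import datetime
from email.utils import parsedate_to_datetime
from typing import Optional

def parse_retry_after(value: Optional[str]) -> Optional[float]:
    """Return a non-negative delay in seconds, or None when unparseable."""
    if not value:
        return None
    value = value.strip()
    if not value:
        return None
    if value.isdigit():
        # Delta-seconds form, e.g. "120"
        return max(0.0, float(value))
    try:
        # HTTP-date form, e.g. "Wed, 21 Oct 2015 07:28:00 GMT"
        when = parsedate_to_datetime(value)
    except (TypeError, ValueError):
        return None
    if when.tzinfo is None:
        return None  # a naive date cannot be compared safely
    delta = when - datetime.now(tz=when.tzinfo)
    return max(0.0, delta.total_seconds())
```

Clamping to zero means a date already in the past simply allows an immediate retry, and rejecting naive dates avoids a `TypeError` when subtracting timezone-aware `now()`.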


@@ -11,13 +11,14 @@ logger = logging.getLogger(__name__)
class EmbeddingService(BaseModelService):
"""Embedding-specific service implementation"""
-def __init__(self, scanner):
+def __init__(self, scanner, update_service=None):
"""Initialize Embedding service
Args:
scanner: Embedding scanner instance
+update_service: Optional service for remote update tracking.
"""
-super().__init__("embedding", scanner, EmbeddingMetadata)
+super().__init__("embedding", scanner, EmbeddingMetadata, update_service=update_service)
async def format_response(self, embedding_data: Dict) -> Dict:
"""Format Embedding data for API response"""

py/services/errors.py
@@ -0,0 +1,21 @@
"""Common service-level exception types."""
from __future__ import annotations
from typing import Optional
class RateLimitError(RuntimeError):
"""Raised when a remote provider rejects a request due to rate limiting."""
def __init__(
self,
message: str,
*,
retry_after: Optional[float] = None,
provider: Optional[str] = None,
) -> None:
super().__init__(message)
self.retry_after = retry_after
self.provider = provider
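A caller can use the `retry_after` hint carried by this exception to schedule a backoff before retrying the provider. A small sketch (the `next_delay` helper and its 30-second default are assumptions for illustration, not part of the module):

```python
from typing import Optional

class RateLimitError(RuntimeError):
    """Mirror of the exception above: carries optional retry metadata."""
    def __init__(
        self,
        message: str,
        *,
        retry_after: Optional[float] = None,
        provider: Optional[str] = None,
    ) -> None:
        super().__init__(message)
        self.retry_after = retry_after
        self.provider = provider

def next_delay(exc: RateLimitError, default: float = 30.0) -> float:
    # Honour the server-provided hint when present, else back off by a default.
    return exc.retry_after if exc.retry_after is not None else default

err = RateLimitError("Request rate limited", retry_after=12.5, provider="civitai")
```

Keyword-only `retry_after` and `provider` keep call sites explicit, so a bare `RateLimitError("...")` still works for providers that send no hint.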


@@ -11,13 +11,14 @@ logger = logging.getLogger(__name__)
class LoraService(BaseModelService):
"""LoRA-specific service implementation"""
-def __init__(self, scanner):
+def __init__(self, scanner, update_service=None):
"""Initialize LoRA service
Args:
scanner: LoRA scanner instance
+update_service: Optional service for remote update tracking.
"""
-super().__init__("lora", scanner, LoraMetadata)
+super().__init__("lora", scanner, LoraMetadata, update_service=update_service)
async def format_response(self, lora_data: Dict) -> Dict:
"""Format LoRA data for API response"""


@@ -3,7 +3,7 @@ import logging
import asyncio
from pathlib import Path
from typing import Optional
-from .downloader import get_downloader
+from .downloader import get_downloader, DownloadProgress
logger = logging.getLogger(__name__)
@@ -77,9 +77,15 @@ class MetadataArchiveManager:
progress_callback("download", f"Downloading from {url}")
# Custom progress callback to report download progress
-async def download_progress(progress):
+async def download_progress(progress, snapshot=None):
if progress_callback:
-progress_callback("download", f"Downloading archive... {progress:.1f}%")
+if isinstance(progress, DownloadProgress):
+percent = progress.percent_complete
+elif isinstance(snapshot, DownloadProgress):
+percent = snapshot.percent_complete
+else:
+percent = float(progress or 0)
+progress_callback("download", f"Downloading archive... {percent:.1f}%")
success, result = await downloader.download_file(
url=url,


@@ -1,9 +1,11 @@
import os
import logging
from .model_metadata_provider import (
+ModelMetadataProvider,
ModelMetadataProviderManager,
SQLiteModelMetadataProvider,
CivitaiModelMetadataProvider,
+CivArchiveModelMetadataProvider,
FallbackMetadataProvider
)
from .settings_manager import get_settings_manager
@@ -54,26 +56,27 @@ async def initialize_metadata_providers():
except Exception as e:
logger.error(f"Failed to initialize Civitai API metadata provider: {e}")
-# Register CivArchive provider, but do NOT add to fallback providers
+# Register CivArchive provider, and also add it to the fallback providers
try:
-from .model_metadata_provider import CivArchiveModelMetadataProvider
-civarchive_provider = CivArchiveModelMetadataProvider()
-provider_manager.register_provider('civarchive', civarchive_provider)
-logger.debug("CivArchive metadata provider registered (not included in fallback)")
+civarchive_client = await ServiceRegistry.get_civarchive_client()
+civarchive_provider = CivArchiveModelMetadataProvider(civarchive_client)
+provider_manager.register_provider('civarchive_api', civarchive_provider)
+providers.append(('civarchive_api', civarchive_provider))
+logger.debug("CivArchive metadata provider registered (also included in fallback)")
except Exception as e:
logger.error(f"Failed to initialize CivArchive metadata provider: {e}")
# Set up fallback provider based on available providers
if len(providers) > 1:
-# Always use Civitai API first, then Archive DB
+# Always use Civitai API (it has better metadata), then CivArchive API, then Archive DB
-ordered_providers = []
+ordered_providers: list[tuple[str, ModelMetadataProvider]] = []
-ordered_providers.extend([p[1] for p in providers if p[0] == 'civitai_api'])
-ordered_providers.extend([p[1] for p in providers if p[0] == 'sqlite'])
+ordered_providers.extend([p for p in providers if p[0] == 'civitai_api'])
+ordered_providers.extend([p for p in providers if p[0] == 'civarchive_api'])
+ordered_providers.extend([p for p in providers if p[0] == 'sqlite'])
if ordered_providers:
fallback_provider = FallbackMetadataProvider(ordered_providers)
provider_manager.register_provider('fallback', fallback_provider, is_default=True)
-logger.debug(f"Fallback metadata provider registered with {len(ordered_providers)} providers, Civitai API first")
elif len(providers) == 1:
# Only one provider available, set it as default
provider_name, provider = providers[0]
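The ordering above (Civitai API, then CivArchive API, then the local archive DB) means each lookup walks the chain until one provider answers. An illustrative sketch of that ordered-fallback pattern (the class shape and stub providers are assumptions for demonstration, not the project's actual `FallbackMetadataProvider` API):

```python
import asyncio
from typing import Optional, Tuple

class FallbackProvider:
    """Try each named provider in order until one returns metadata."""
    def __init__(self, providers):
        self._providers = providers  # list of (name, provider) pairs

    async def get_model_by_hash(self, sha256: str) -> Tuple[Optional[dict], Optional[str]]:
        last_error = None
        for name, provider in self._providers:
            metadata, error = await provider.get_model_by_hash(sha256)
            if metadata:
                return metadata, None  # first successful provider wins
            last_error = error
        return None, last_error

class Stub:
    """Test double that returns a canned (metadata, error) tuple."""
    def __init__(self, result):
        self._result = result
    async def get_model_by_hash(self, sha256):
        return self._result

chain = FallbackProvider([
    ("civitai_api", Stub((None, "Model not found"))),
    ("civarchive_api", Stub(({"name": "example"}, None))),
    ("sqlite", Stub((None, None))),
])
metadata, error = asyncio.run(chain.get_model_by_hash("deadbeef"))
```

Because the first hit short-circuits the loop, the cheaper or higher-quality providers at the front of the list absorb most traffic and the archive DB is only consulted as a last resort.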


@@ -10,6 +10,7 @@ from typing import Any, Awaitable, Callable, Dict, Iterable, Optional
from ..services.settings_manager import SettingsManager
from ..utils.model_utils import determine_base_model
+from .errors import RateLimitError
logger = logging.getLogger(__name__)
@@ -153,7 +154,12 @@ class MetadataSyncService:
model_data: Dict[str, Any],
update_cache_func: Callable[[str, str, Dict[str, Any]], Awaitable[bool]],
) -> tuple[bool, Optional[str]]:
-"""Fetch metadata for a model and update both disk and cache state."""
+"""Fetch metadata for a model and update both disk and cache state.
+
+Callers should hydrate ``model_data`` via ``MetadataManager.hydrate_model_data``
+before invoking this method so that the persisted payload retains all known
+metadata fields.
+"""
if not isinstance(model_data, dict):
error = f"Invalid model_data type: {type(model_data)}"
@@ -162,42 +168,118 @@ class MetadataSyncService:
         metadata_path = os.path.splitext(file_path)[0] + ".metadata.json"
         enable_archive = self._settings.get("enable_metadata_archive_db", False)
+        previous_source = model_data.get("metadata_source") or (model_data.get("civitai") or {}).get("source")

         try:
+            provider_attempts: list[tuple[Optional[str], MetadataProviderProtocol]] = []
+            sqlite_attempted = False
             if model_data.get("civitai_deleted") is True:
-                if not enable_archive or model_data.get("db_checked") is True:
+                if previous_source in (None, "civarchive"):
+                    try:
+                        provider_attempts.append(("civarchive_api", await self._get_provider("civarchive_api")))
+                    except Exception as exc:  # pragma: no cover - provider resolution fault
+                        logger.debug("Unable to resolve civarchive provider: %s", exc)
+                if enable_archive and model_data.get("db_checked") is not True:
+                    try:
+                        provider_attempts.append(("sqlite", await self._get_provider("sqlite")))
+                    except Exception as exc:  # pragma: no cover - provider resolution fault
+                        logger.debug("Unable to resolve sqlite provider: %s", exc)
+                if not provider_attempts:
                     if not enable_archive:
                         error_msg = "CivitAI model is deleted and metadata archive DB is not enabled"
-                    else:
+                    elif model_data.get("db_checked") is True:
                         error_msg = "CivitAI model is deleted and not found in metadata archive DB"
-                    return (False, error_msg)
-                metadata_provider = await self._get_provider("sqlite")
+                    else:
+                        error_msg = "CivitAI model is deleted and no archive provider is available"
+                    return False, error_msg
             else:
-                metadata_provider = await self._get_default_provider()
+                provider_attempts.append((None, await self._get_default_provider()))

-            civitai_metadata, error = await metadata_provider.get_model_by_hash(sha256)
-            if not civitai_metadata:
-                if error == "Model not found":
+            civitai_metadata: Optional[Dict[str, Any]] = None
+            metadata_provider: Optional[MetadataProviderProtocol] = None
+            provider_used: Optional[str] = None
+            last_error: Optional[str] = None
+            for provider_name, provider in provider_attempts:
+                try:
+                    civitai_metadata_candidate, error = await provider.get_model_by_hash(sha256)
+                except RateLimitError as exc:
+                    exc.provider = exc.provider or (provider_name or provider.__class__.__name__)
+                    raise
+                except Exception as exc:  # pragma: no cover - defensive logging
+                    logger.error("Provider %s failed for hash %s: %s", provider_name, sha256, exc)
+                    civitai_metadata_candidate, error = None, str(exc)
+                if provider_name == "sqlite":
+                    sqlite_attempted = True
+                if civitai_metadata_candidate:
+                    civitai_metadata = civitai_metadata_candidate
+                    metadata_provider = provider
+                    provider_used = provider_name
+                    break
+                last_error = error or last_error

+            if civitai_metadata is None or metadata_provider is None:
+                if sqlite_attempted:
+                    model_data["db_checked"] = True
+                if last_error == "Model not found":
                     model_data["from_civitai"] = False
                     model_data["civitai_deleted"] = True
-                    model_data["db_checked"] = enable_archive
+                    model_data["db_checked"] = sqlite_attempted or (enable_archive and model_data.get("db_checked", False))
                     model_data["last_checked_at"] = datetime.now().timestamp()
                     data_to_save = model_data.copy()
                     data_to_save.pop("folder", None)
                     await self._metadata_manager.save_metadata(file_path, data_to_save)
+                default_error = (
+                    "CivitAI model is deleted and metadata archive DB is not enabled"
+                    if model_data.get("civitai_deleted") and not enable_archive
+                    else "CivitAI model is deleted and not found in metadata archive DB"
+                    if model_data.get("civitai_deleted") and (model_data.get("db_checked") is True or sqlite_attempted)
+                    else "No provider returned metadata"
+                )
                 error_msg = (
-                    f"Error fetching metadata: {error} (model_name={model_data.get('model_name', '')})"
+                    f"Error fetching metadata: {last_error or default_error} "
+                    f"(model_name={model_data.get('model_name', '')})"
                 )
                 logger.error(error_msg)
                 return False, error_msg

             model_data["from_civitai"] = True
-            model_data["civitai_deleted"] = civitai_metadata.get("source") == "archive_db"
-            model_data["db_checked"] = enable_archive
+            model_data["civitai_deleted"] = civitai_metadata.get("source") == "archive_db" or civitai_metadata.get("source") == "civarchive"
+            model_data["db_checked"] = enable_archive and (
+                civitai_metadata.get("source") == "archive_db" or sqlite_attempted
+            )
+            source = civitai_metadata.get("source") or "civitai_api"
+            if source == "api":
+                source = "civitai_api"
+            elif provider_used == "civarchive_api" and source != "civarchive":
+                source = "civarchive"
+            elif provider_used == "sqlite":
+                source = "archive_db"
+            model_data["metadata_source"] = source
             model_data["last_checked_at"] = datetime.now().timestamp()
+            readable_source = {
+                "civitai_api": "CivitAI API",
+                "civarchive": "CivArchive API",
+                "archive_db": "Archive Database",
+            }.get(source, source)
+            logger.info(
+                "Fetched metadata for %s via %s",
+                model_data.get("model_name", ""),
+                readable_source,
+            )

             local_metadata = model_data.copy()
             local_metadata.pop("folder", None)
@@ -221,6 +303,16 @@ class MetadataSyncService:
             error_msg = f"Error fetching metadata - Missing key: {exc} in model_data={model_data}"
             logger.error(error_msg)
             return False, error_msg
+        except RateLimitError as exc:
+            provider_label = exc.provider or "metadata provider"
+            wait_hint = (
+                f"; retry after approximately {int(exc.retry_after)}s"
+                if exc.retry_after and exc.retry_after > 0
+                else ""
+            )
+            error_msg = f"Rate limited by {provider_label}{wait_hint}"
+            logger.warning(error_msg)
+            return False, error_msg
         except Exception as exc:  # pragma: no cover - error path
             error_msg = f"Error fetching metadata: {exc}"
             logger.error(error_msg, exc_info=True)
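The deleted-model branch in the hunk above builds an ordered list of archive providers before giving up with an error: CivArchive's API is tried first when the model was last resolved there (or never resolved), then the local SQLite archive if it is enabled and not yet checked. A standalone sketch of that ordering decision — the function name and flat signature are illustrative, not the project's API:

```python
from typing import List, Optional

def plan_provider_attempts(
    previous_source: Optional[str],
    enable_archive: bool,
    db_checked: bool,
) -> List[str]:
    """Mirror the fallback ordering for a model CivitAI has deleted."""
    attempts: List[str] = []
    # CivArchive API first when the model was last seen there or never resolved
    if previous_source in (None, "civarchive"):
        attempts.append("civarchive_api")
    # Local SQLite archive only if enabled and not already checked
    if enable_archive and not db_checked:
        attempts.append("sqlite")
    return attempts
```

An empty list corresponds to the `if not provider_attempts:` early-return path in the diff.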


@@ -1,8 +1,11 @@
 from abc import ABC, abstractmethod
+import asyncio
 import json
 import logging
+import random
-from typing import Optional, Dict, Tuple, Any, List
+from typing import Optional, Dict, Tuple, Any, List, Sequence
 from .downloader import get_downloader
+from .errors import RateLimitError

 try:
     from bs4 import BeautifulSoup
@@ -88,122 +91,22 @@ class CivitaiModelMetadataProvider(ModelMetadataProvider):
         return await self.client.get_user_models(username)

 class CivArchiveModelMetadataProvider(ModelMetadataProvider):
-    """Provider that uses CivArchive HTML page parsing for metadata"""
+    """Provider that uses CivArchive API for metadata"""
+
+    def __init__(self, civarchive_client):
+        self.client = civarchive_client

     async def get_model_by_hash(self, model_hash: str) -> Tuple[Optional[Dict], Optional[str]]:
-        """Not supported by CivArchive provider"""
-        return None, "CivArchive provider does not support hash lookup"
+        return await self.client.get_model_by_hash(model_hash)

     async def get_model_versions(self, model_id: str) -> Optional[Dict]:
-        """Not supported by CivArchive provider"""
-        return None
+        return await self.client.get_model_versions(model_id)

     async def get_model_version(self, model_id: int = None, version_id: int = None) -> Optional[Dict]:
-        """Get specific model version by parsing CivArchive HTML page"""
-        if model_id is None or version_id is None:
-            return None
-        try:
-            # Construct CivArchive URL
-            url = f"https://civarchive.com/models/{model_id}?modelVersionId={version_id}"
-            downloader = await get_downloader()
-            session = await downloader.session
-            async with session.get(url) as response:
-                if response.status != 200:
-                    return None
-                html_content = await response.text()
-            # Parse HTML to extract JSON data
-            soup_parser = _require_beautifulsoup()
-            soup = soup_parser(html_content, 'html.parser')
-            script_tag = soup.find('script', {'id': '__NEXT_DATA__', 'type': 'application/json'})
-            if not script_tag:
-                return None
-            # Parse JSON content
-            json_data = json.loads(script_tag.string)
-            model_data = json_data.get('props', {}).get('pageProps', {}).get('model')
-            if not model_data or 'version' not in model_data:
-                return None
-            # Extract version data as base
-            version = model_data['version'].copy()
-            # Restructure stats
-            if 'downloadCount' in version and 'ratingCount' in version and 'rating' in version:
-                version['stats'] = {
-                    'downloadCount': version.pop('downloadCount'),
-                    'ratingCount': version.pop('ratingCount'),
-                    'rating': version.pop('rating')
-                }
-            # Rename trigger to trainedWords
-            if 'trigger' in version:
-                version['trainedWords'] = version.pop('trigger')
-            # Transform files data to expected format
-            if 'files' in version:
-                transformed_files = []
-                for file_data in version['files']:
-                    # Find first available mirror (deletedAt is null)
-                    available_mirror = None
-                    for mirror in file_data.get('mirrors', []):
-                        if mirror.get('deletedAt') is None:
-                            available_mirror = mirror
-                            break
-                    # Create transformed file entry
-                    transformed_file = {
-                        'id': file_data.get('id'),
-                        'sizeKB': file_data.get('sizeKB'),
-                        'name': available_mirror.get('filename', file_data.get('name')) if available_mirror else file_data.get('name'),
-                        'type': file_data.get('type'),
-                        'downloadUrl': available_mirror.get('url') if available_mirror else None,
-                        'primary': True,
-                        'mirrors': file_data.get('mirrors', [])
-                    }
-                    # Transform hash format
-                    if 'sha256' in file_data:
-                        transformed_file['hashes'] = {
-                            'SHA256': file_data['sha256'].upper()
-                        }
-                    transformed_files.append(transformed_file)
-                version['files'] = transformed_files
-            # Add model information
-            version['model'] = {
-                'name': model_data.get('name'),
-                'type': model_data.get('type'),
-                'nsfw': model_data.get('is_nsfw', False),
-                'description': model_data.get('description'),
-                'tags': model_data.get('tags', [])
-            }
-            version['creator'] = {
-                'username': model_data.get('username'),
-                'image': ''
-            }
-            # Add source identifier
-            version['source'] = 'civarchive'
-            version['is_deleted'] = json_data.get('query', {}).get('is_deleted', False)
-            return version
-        except Exception as e:
-            logger.error(f"Error fetching CivArchive model version {model_id}/{version_id}: {e}")
-            return None
+        return await self.client.get_model_version(model_id, version_id)

     async def get_model_version_info(self, version_id: str) -> Tuple[Optional[Dict], Optional[str]]:
-        """Not supported by CivArchive provider - requires both model_id and version_id"""
-        return None, "CivArchive provider requires both model_id and version_id"
+        return await self.client.get_model_version_info(version_id)

     async def get_user_models(self, username: str) -> Optional[List[Dict]]:
         """Not supported by CivArchive provider"""
@@ -450,64 +353,166 @@ class SQLiteModelMetadataProvider(ModelMetadataProvider):

 class FallbackMetadataProvider(ModelMetadataProvider):
     """Try providers in order, return first successful result."""

-    def __init__(self, providers: list):
-        self.providers = providers
+    def __init__(
+        self,
+        providers: Sequence[ModelMetadataProvider | Tuple[str, ModelMetadataProvider]],
+        *,
+        rate_limit_retry_limit: int = 3,
+        rate_limit_base_delay: float = 1.5,
+        rate_limit_max_delay: float = 30.0,
+        rate_limit_jitter_ratio: float = 0.2,
+    ) -> None:
+        self.providers: List[ModelMetadataProvider] = []
+        self._provider_labels: List[str] = []
+        for entry in providers:
+            if isinstance(entry, tuple) and len(entry) == 2:
+                name, provider = entry
+            else:
+                provider = entry
+                name = provider.__class__.__name__
+            self.providers.append(provider)
+            self._provider_labels.append(str(name))
+        self._rate_limit_retry_limit = max(1, rate_limit_retry_limit)
+        self._rate_limit_base_delay = rate_limit_base_delay
+        self._rate_limit_max_delay = rate_limit_max_delay
+        self._rate_limit_jitter_ratio = max(0.0, rate_limit_jitter_ratio)

     async def get_model_by_hash(self, model_hash: str) -> Tuple[Optional[Dict], Optional[str]]:
-        for provider in self.providers:
+        for provider, label in self._iter_providers():
             try:
-                result, error = await provider.get_model_by_hash(model_hash)
+                result, error = await self._call_with_rate_limit(
+                    label,
+                    provider.get_model_by_hash,
+                    model_hash,
+                )
                 if result:
                     return result, error
+            except RateLimitError as exc:
+                exc.provider = exc.provider or label
+                raise exc
             except Exception as e:
-                logger.debug(f"Provider failed for get_model_by_hash: {e}")
+                logger.debug("Provider %s failed for get_model_by_hash: %s", label, e)
                 continue
         return None, "Model not found"

     async def get_model_versions(self, model_id: str) -> Optional[Dict]:
-        for provider in self.providers:
+        for provider, label in self._iter_providers():
             try:
-                result = await provider.get_model_versions(model_id)
+                result = await self._call_with_rate_limit(
+                    label,
+                    provider.get_model_versions,
+                    model_id,
+                )
                 if result:
                     return result
+            except RateLimitError as exc:
+                exc.provider = exc.provider or label
+                raise exc
             except Exception as e:
-                logger.debug(f"Provider failed for get_model_versions: {e}")
+                logger.debug("Provider %s failed for get_model_versions: %s", label, e)
                 continue
         return None

     async def get_model_version(self, model_id: int = None, version_id: int = None) -> Optional[Dict]:
-        for provider in self.providers:
+        for provider, label in self._iter_providers():
             try:
-                result = await provider.get_model_version(model_id, version_id)
+                result = await self._call_with_rate_limit(
+                    label,
+                    provider.get_model_version,
+                    model_id,
+                    version_id,
+                )
                 if result:
                     return result
+            except RateLimitError as exc:
+                exc.provider = exc.provider or label
+                raise exc
             except Exception as e:
-                logger.debug(f"Provider failed for get_model_version: {e}")
+                logger.debug("Provider %s failed for get_model_version: %s", label, e)
                 continue
         return None

     async def get_model_version_info(self, version_id: str) -> Tuple[Optional[Dict], Optional[str]]:
-        for provider in self.providers:
+        for provider, label in self._iter_providers():
             try:
-                result, error = await provider.get_model_version_info(version_id)
+                result, error = await self._call_with_rate_limit(
+                    label,
+                    provider.get_model_version_info,
+                    version_id,
+                )
                 if result:
                     return result, error
+            except RateLimitError as exc:
+                exc.provider = exc.provider or label
+                raise exc
             except Exception as e:
-                logger.debug(f"Provider failed for get_model_version_info: {e}")
+                logger.debug("Provider %s failed for get_model_version_info: %s", label, e)
                 continue
         return None, "No provider could retrieve the data"

     async def get_user_models(self, username: str) -> Optional[List[Dict]]:
-        for provider in self.providers:
+        for provider, label in self._iter_providers():
             try:
-                result = await provider.get_user_models(username)
+                result = await self._call_with_rate_limit(
+                    label,
+                    provider.get_user_models,
+                    username,
+                )
                 if result is not None:
                     return result
+            except RateLimitError as exc:
+                exc.provider = exc.provider or label
+                raise exc
             except Exception as e:
-                logger.debug(f"Provider failed for get_user_models: {e}")
+                logger.debug("Provider %s failed for get_user_models: %s", label, e)
                 continue
         return None

+    def _iter_providers(self):
+        return zip(self.providers, self._provider_labels)
+
+    async def _call_with_rate_limit(
+        self,
+        label: str,
+        func,
+        *args,
+        **kwargs,
+    ):
+        attempt = 0
+        while True:
+            try:
+                return await func(*args, **kwargs)
+            except RateLimitError as exc:
+                attempt += 1
+                if attempt >= self._rate_limit_retry_limit:
+                    exc.provider = exc.provider or label
+                    raise exc
+                delay = self._calculate_rate_limit_delay(exc.retry_after, attempt)
+                logger.warning(
+                    "Provider %s rate limited request; retrying in %.2fs (attempt %s/%s)",
+                    label,
+                    delay,
+                    attempt,
+                    self._rate_limit_retry_limit,
+                )
+                await asyncio.sleep(delay)
+            except Exception:
+                raise
+
+    def _calculate_rate_limit_delay(self, retry_after: Optional[float], attempt: int) -> float:
+        if retry_after is not None:
+            return min(self._rate_limit_max_delay, max(0.0, retry_after))
+        base_delay = self._rate_limit_base_delay * (2 ** max(0, attempt - 1))
+        jitter_span = base_delay * self._rate_limit_jitter_ratio
+        if jitter_span > 0:
+            base_delay += random.uniform(-jitter_span, jitter_span)
+        return min(self._rate_limit_max_delay, max(0.0, base_delay))

 class ModelMetadataProviderManager:
     """Manager for selecting and using model metadata providers"""
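The retry helper added above picks its sleep time in two steps: a server-supplied `retry_after` wins outright (clamped to the cap), otherwise it uses exponential backoff with symmetric jitter. A self-contained sketch of the same delay formula — the defaults mirror the diff, but this free function is an illustration, not the project's module:

```python
import random
from typing import Optional

def rate_limit_delay(
    retry_after: Optional[float],
    attempt: int,
    *,
    base_delay: float = 1.5,
    max_delay: float = 30.0,
    jitter_ratio: float = 0.2,
) -> float:
    """Delay before retry `attempt` (1-based) after a rate-limit response."""
    if retry_after is not None:
        # Honor the server hint, clamped to [0, max_delay]
        return min(max_delay, max(0.0, retry_after))
    # Exponential backoff: base, 2*base, 4*base, ...
    delay = base_delay * (2 ** max(0, attempt - 1))
    # Symmetric jitter of ±jitter_ratio to de-synchronize concurrent retries
    span = delay * jitter_ratio
    if span > 0:
        delay += random.uniform(-span, span)
    return min(max_delay, max(0.0, delay))
```

With the defaults, attempt 3 yields 6.0s ± 20% before clamping, so the sequence stays bounded by 30s no matter how many retries occur.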


@@ -119,6 +119,12 @@ class ModelScanner:
             if value not in (None, '', []):
                 slim[key] = value

+        creator = civitai.get('creator')
+        if isinstance(creator, Mapping):
+            username = creator.get('username')
+            if username:
+                slim['creator'] = {'username': username}
+
         trained_words = civitai.get('trainedWords')
         if trained_words:
             slim['trainedWords'] = list(trained_words) if isinstance(trained_words, list) else trained_words
@@ -183,6 +189,7 @@ class ModelScanner:
             'favorite': bool(get_value('favorite', False)),
             'notes': notes,
             'usage_tips': usage_tips,
+            'metadata_source': get_value('metadata_source', None),
             'exclude': bool(get_value('exclude', False)),
             'db_checked': bool(get_value('db_checked', False)),
             'last_checked_at': float(get_value('last_checked_at', 0.0) or 0.0),
@@ -617,6 +624,7 @@ class ModelScanner:
         for i in range(0, len(new_files), batch_size):
             batch = new_files[i:i+batch_size]
             for path in batch:
+                logger.info(f"{self.model_type.capitalize()} Scanner: Processing {path}")
                 try:
                     # Find the appropriate root path for this file
                     root_path = None
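The first hunk above keeps only the creator's username when slimming cached CivitAI metadata, discarding avatar URLs and other creator fields. A minimal sketch of that extraction, assuming plain dict input (the function name is illustrative):

```python
from collections.abc import Mapping

def slim_creator(civitai: Mapping) -> dict:
    """Retain only {'creator': {'username': ...}} when a username exists."""
    slim = {}
    creator = civitai.get('creator')
    if isinstance(creator, Mapping):
        username = creator.get('username')
        if username:
            slim['creator'] = {'username': username}
    return slim
```

Malformed or absent creator payloads simply produce no `creator` key, which keeps the slimmed cache entry compact.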


@@ -0,0 +1,411 @@
"""Service for tracking remote model version updates."""

from __future__ import annotations

import asyncio
import json
import logging
import os
import sqlite3
import time
from dataclasses import dataclass
from typing import Dict, Iterable, List, Mapping, Optional, Sequence

from .errors import RateLimitError

logger = logging.getLogger(__name__)


@dataclass
class ModelUpdateRecord:
    """Representation of a persisted update record."""

    model_type: str
    model_id: int
    largest_version_id: Optional[int]
    version_ids: List[int]
    in_library_version_ids: List[int]
    last_checked_at: Optional[float]
    should_ignore: bool

    def has_update(self) -> bool:
        """Return True when remote versions exceed the local library."""
        if self.should_ignore or not self.version_ids:
            return False
        local_versions = set(self.in_library_version_ids)
        return any(version_id not in local_versions for version_id in self.version_ids)


class ModelUpdateService:
    """Persist and query remote model version metadata."""

    _SCHEMA = """
    CREATE TABLE IF NOT EXISTS model_update_status (
        model_type TEXT NOT NULL,
        model_id INTEGER NOT NULL,
        largest_version_id INTEGER,
        version_ids TEXT,
        in_library_version_ids TEXT,
        last_checked_at REAL,
        should_ignore INTEGER DEFAULT 0,
        PRIMARY KEY (model_type, model_id)
    )
    """

    def __init__(self, db_path: str, *, ttl_seconds: int = 24 * 60 * 60) -> None:
        self._db_path = db_path
        self._ttl_seconds = ttl_seconds
        self._lock = asyncio.Lock()
        self._schema_initialized = False
        self._ensure_directory()
        self._initialize_schema()

    def _ensure_directory(self) -> None:
        directory = os.path.dirname(self._db_path)
        if directory:
            os.makedirs(directory, exist_ok=True)

    def _connect(self) -> sqlite3.Connection:
        conn = sqlite3.connect(self._db_path, check_same_thread=False)
        conn.row_factory = sqlite3.Row
        return conn

    def _initialize_schema(self) -> None:
        if self._schema_initialized:
            return
        try:
            with self._connect() as conn:
                conn.execute("PRAGMA journal_mode=WAL")
                conn.execute("PRAGMA foreign_keys = ON")
                conn.executescript(self._SCHEMA)
            self._schema_initialized = True
        except Exception as exc:  # pragma: no cover - defensive guard
            logger.error("Failed to initialize update schema: %s", exc, exc_info=True)
            raise

    async def refresh_for_model_type(
        self,
        model_type: str,
        scanner,
        metadata_provider,
        *,
        force_refresh: bool = False,
    ) -> Dict[int, ModelUpdateRecord]:
        """Refresh update information for every model present in the cache."""
        local_versions = await self._collect_local_versions(scanner)
        results: Dict[int, ModelUpdateRecord] = {}
        for model_id, version_ids in local_versions.items():
            record = await self._refresh_single_model(
                model_type,
                model_id,
                version_ids,
                metadata_provider,
                force_refresh=force_refresh,
            )
            if record:
                results[model_id] = record
        return results

    async def refresh_single_model(
        self,
        model_type: str,
        model_id: int,
        scanner,
        metadata_provider,
        *,
        force_refresh: bool = False,
    ) -> Optional[ModelUpdateRecord]:
        """Refresh update information for a specific model id."""
        local_versions = await self._collect_local_versions(scanner)
        version_ids = local_versions.get(model_id, [])
        return await self._refresh_single_model(
            model_type,
            model_id,
            version_ids,
            metadata_provider,
            force_refresh=force_refresh,
        )

    async def update_in_library_versions(
        self,
        model_type: str,
        model_id: int,
        version_ids: Sequence[int],
    ) -> ModelUpdateRecord:
        """Persist a new set of in-library version identifiers."""
        normalized_versions = self._normalize_sequence(version_ids)
        async with self._lock:
            existing = self._get_record(model_type, model_id)
            record = ModelUpdateRecord(
                model_type=model_type,
                model_id=model_id,
                largest_version_id=existing.largest_version_id if existing else None,
                version_ids=list(existing.version_ids) if existing else [],
                in_library_version_ids=normalized_versions,
                last_checked_at=existing.last_checked_at if existing else None,
                should_ignore=existing.should_ignore if existing else False,
            )
            self._upsert_record(record)
            return record

    async def set_should_ignore(
        self, model_type: str, model_id: int, should_ignore: bool
    ) -> ModelUpdateRecord:
        """Toggle the ignore flag for a model."""
        async with self._lock:
            existing = self._get_record(model_type, model_id)
            if existing:
                record = ModelUpdateRecord(
                    model_type=model_type,
                    model_id=model_id,
                    largest_version_id=existing.largest_version_id,
                    version_ids=list(existing.version_ids),
                    in_library_version_ids=list(existing.in_library_version_ids),
                    last_checked_at=existing.last_checked_at,
                    should_ignore=should_ignore,
                )
            else:
                record = ModelUpdateRecord(
                    model_type=model_type,
                    model_id=model_id,
                    largest_version_id=None,
                    version_ids=[],
                    in_library_version_ids=[],
                    last_checked_at=None,
                    should_ignore=should_ignore,
                )
            self._upsert_record(record)
            return record

    async def get_record(self, model_type: str, model_id: int) -> Optional[ModelUpdateRecord]:
        """Return a cached record without triggering remote fetches."""
        async with self._lock:
            return self._get_record(model_type, model_id)

    async def has_update(self, model_type: str, model_id: int) -> bool:
        """Determine if a model has updates pending."""
        record = await self.get_record(model_type, model_id)
        return record.has_update() if record else False

    async def _refresh_single_model(
        self,
        model_type: str,
        model_id: int,
        local_versions: Sequence[int],
        metadata_provider,
        *,
        force_refresh: bool = False,
    ) -> Optional[ModelUpdateRecord]:
        normalized_local = self._normalize_sequence(local_versions)
        now = time.time()
        async with self._lock:
            existing = self._get_record(model_type, model_id)
            if existing and existing.should_ignore and not force_refresh:
                record = ModelUpdateRecord(
                    model_type=model_type,
                    model_id=model_id,
                    largest_version_id=existing.largest_version_id,
                    version_ids=list(existing.version_ids),
                    in_library_version_ids=normalized_local,
                    last_checked_at=existing.last_checked_at,
                    should_ignore=True,
                )
                self._upsert_record(record)
                return record
            should_fetch = force_refresh or not existing or self._is_stale(existing, now)

        # release lock during network request
        fetched_versions: List[int] | None = None
        refresh_succeeded = False
        if metadata_provider and should_fetch:
            try:
                response = await metadata_provider.get_model_versions(model_id)
            except RateLimitError:
                raise
            except Exception as exc:  # pragma: no cover - defensive log
                logger.error(
                    "Failed to fetch versions for model %s (%s): %s",
                    model_id,
                    model_type,
                    exc,
                    exc_info=True,
                )
            else:
                if response is not None:
                    extracted = self._extract_version_ids(response)
                    if extracted is not None:
                        fetched_versions = extracted
                        refresh_succeeded = True

        async with self._lock:
            existing = self._get_record(model_type, model_id)
            if existing and existing.should_ignore and not force_refresh:
                # Ignore state could have flipped while awaiting provider
                record = ModelUpdateRecord(
                    model_type=model_type,
                    model_id=model_id,
                    largest_version_id=existing.largest_version_id,
                    version_ids=list(existing.version_ids),
                    in_library_version_ids=normalized_local,
                    last_checked_at=existing.last_checked_at,
                    should_ignore=True,
                )
                self._upsert_record(record)
                return record
            version_ids = (
                fetched_versions
                if refresh_succeeded
                else (list(existing.version_ids) if existing else [])
            )
            largest = max(version_ids) if version_ids else None
            last_checked = now if refresh_succeeded else (
                existing.last_checked_at if existing else None
            )
            record = ModelUpdateRecord(
                model_type=model_type,
                model_id=model_id,
                largest_version_id=largest,
                version_ids=version_ids,
                in_library_version_ids=normalized_local,
                last_checked_at=last_checked,
                should_ignore=existing.should_ignore if existing else False,
            )
            self._upsert_record(record)
            return record

    async def _collect_local_versions(self, scanner) -> Dict[int, List[int]]:
        cache = await scanner.get_cached_data()
        mapping: Dict[int, set[int]] = {}
        if not cache or not getattr(cache, "raw_data", None):
            return {}
        for item in cache.raw_data:
            civitai = item.get("civitai") if isinstance(item, dict) else None
            if not isinstance(civitai, dict):
                continue
            model_id = self._normalize_int(civitai.get("modelId"))
            version_id = self._normalize_int(civitai.get("id"))
            if model_id is None or version_id is None:
                continue
            mapping.setdefault(model_id, set()).add(version_id)
        return {model_id: sorted(ids) for model_id, ids in mapping.items()}

    def _is_stale(self, record: ModelUpdateRecord, now: float) -> bool:
        if record.last_checked_at is None:
            return True
        return (now - record.last_checked_at) >= self._ttl_seconds

    @staticmethod
    def _normalize_int(value) -> Optional[int]:
        try:
            if value is None:
                return None
            return int(value)
        except (TypeError, ValueError):
            return None

    def _normalize_sequence(self, values: Sequence[int]) -> List[int]:
        normalized = [
            item
            for item in (self._normalize_int(value) for value in values)
            if item is not None
        ]
        return sorted(dict.fromkeys(normalized))

    def _extract_version_ids(self, response) -> Optional[List[int]]:
        if not isinstance(response, Mapping):
            return None
        versions = response.get("modelVersions")
        if versions is None:
            return []
        if not isinstance(versions, Iterable):
            return None
        normalized = []
        for entry in versions:
            if isinstance(entry, Mapping):
                normalized_id = self._normalize_int(entry.get("id"))
            else:
                normalized_id = self._normalize_int(entry)
            if normalized_id is not None:
                normalized.append(normalized_id)
        return sorted(dict.fromkeys(normalized))

    def _get_record(self, model_type: str, model_id: int) -> Optional[ModelUpdateRecord]:
        with self._connect() as conn:
            row = conn.execute(
                """
                SELECT model_type, model_id, largest_version_id, version_ids,
                       in_library_version_ids, last_checked_at, should_ignore
                FROM model_update_status
                WHERE model_type = ? AND model_id = ?
                """,
                (model_type, model_id),
            ).fetchone()
        if not row:
            return None
        return ModelUpdateRecord(
            model_type=row["model_type"],
            model_id=int(row["model_id"]),
            largest_version_id=self._normalize_int(row["largest_version_id"]),
            version_ids=self._deserialize_json_array(row["version_ids"]),
            in_library_version_ids=self._deserialize_json_array(
                row["in_library_version_ids"]
            ),
            last_checked_at=row["last_checked_at"],
            should_ignore=bool(row["should_ignore"]),
        )

    def _upsert_record(self, record: ModelUpdateRecord) -> None:
        payload = (
            record.model_type,
            record.model_id,
            record.largest_version_id,
            json.dumps(record.version_ids),
            json.dumps(record.in_library_version_ids),
            record.last_checked_at,
            1 if record.should_ignore else 0,
        )
        with self._connect() as conn:
            conn.execute(
                """
                INSERT INTO model_update_status (
                    model_type, model_id, largest_version_id, version_ids,
                    in_library_version_ids, last_checked_at, should_ignore
                ) VALUES (?, ?, ?, ?, ?, ?, ?)
                ON CONFLICT(model_type, model_id) DO UPDATE SET
                    largest_version_id = excluded.largest_version_id,
                    version_ids = excluded.version_ids,
                    in_library_version_ids = excluded.in_library_version_ids,
                    last_checked_at = excluded.last_checked_at,
                    should_ignore = excluded.should_ignore
                """,
                payload,
            )
            conn.commit()

    @staticmethod
    def _deserialize_json_array(value) -> List[int]:
        if not value:
            return []
        try:
            data = json.loads(value)
        except (TypeError, json.JSONDecodeError):
            return []
        if isinstance(data, list):
            normalized = []
            for entry in data:
                try:
                    normalized.append(int(entry))
                except (TypeError, ValueError):
                    continue
            return sorted(dict.fromkeys(normalized))
        return []
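`ModelUpdateRecord.has_update` in the new service reduces to a set-membership check: any remote version id missing from the local library counts as an update, unless the model is ignored or the remote list is empty. A standalone sketch of that predicate (flat function form for illustration, not the dataclass method itself):

```python
from typing import Sequence

def has_update(
    version_ids: Sequence[int],
    in_library_version_ids: Sequence[int],
    should_ignore: bool = False,
) -> bool:
    """True when the remote catalogue lists a version absent locally."""
    if should_ignore or not version_ids:
        return False
    local = set(in_library_version_ids)
    # Any remote id not in the library means an update is available
    return any(v not in local for v in version_ids)
```

Note the asymmetry: extra local versions never suppress or trigger an update; only remote ids missing locally matter.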


@@ -25,6 +25,34 @@ class PersistentModelCache:
"""Persist core model metadata and hash index data in SQLite.""" """Persist core model metadata and hash index data in SQLite."""
_DEFAULT_FILENAME = "model_cache.sqlite" _DEFAULT_FILENAME = "model_cache.sqlite"
_MODEL_COLUMNS: Tuple[str, ...] = (
"model_type",
"file_path",
"file_name",
"model_name",
"folder",
"size",
"modified",
"sha256",
"base_model",
"preview_url",
"preview_nsfw_level",
"from_civitai",
"favorite",
"notes",
"usage_tips",
"metadata_source",
"civitai_id",
"civitai_model_id",
"civitai_name",
"civitai_creator_username",
"trained_words",
"civitai_deleted",
"exclude",
"db_checked",
"last_checked_at",
)
_MODEL_UPDATE_COLUMNS: Tuple[str, ...] = _MODEL_COLUMNS[2:]
_instances: Dict[str, "PersistentModelCache"] = {} _instances: Dict[str, "PersistentModelCache"] = {}
_instance_lock = threading.Lock() _instance_lock = threading.Lock()
@@ -53,6 +81,11 @@ class PersistentModelCache:
def is_enabled(self) -> bool: def is_enabled(self) -> bool:
return os.environ.get("LORA_MANAGER_DISABLE_PERSISTENT_CACHE", "0") != "1" return os.environ.get("LORA_MANAGER_DISABLE_PERSISTENT_CACHE", "0") != "1"
def get_database_path(self) -> str:
"""Expose the resolved SQLite database path."""
return self._db_path
def load_cache(self, model_type: str) -> Optional[PersistedCacheData]:
if not self.is_enabled():
return None
@@ -64,12 +97,9 @@ class PersistentModelCache:
with self._db_lock:
conn = self._connect(readonly=True)
try:
model_columns_sql = ", ".join(self._MODEL_COLUMNS[1:])
rows = conn.execute(
f"SELECT {model_columns_sql} FROM models WHERE model_type = ?",
(model_type,),
).fetchall()
@@ -101,8 +131,12 @@ class PersistentModelCache:
except json.JSONDecodeError:
trained_words = []
creator_username = row["civitai_creator_username"]
civitai: Optional[Dict] = None
civitai_has_data = any(
row[col] is not None for col in ("civitai_id", "civitai_model_id", "civitai_name")
) or trained_words or creator_username
if civitai_has_data:
civitai = {}
if row["civitai_id"] is not None:
civitai["id"] = row["civitai_id"]
@@ -112,6 +146,8 @@ class PersistentModelCache:
civitai["name"] = row["civitai_name"] civitai["name"] = row["civitai_name"]
if trained_words: if trained_words:
civitai["trainedWords"] = trained_words civitai["trainedWords"] = trained_words
if creator_username:
civitai.setdefault("creator", {})["username"] = creator_username
item = {
"file_path": file_path,
@@ -128,11 +164,13 @@ class PersistentModelCache:
"favorite": bool(row["favorite"]), "favorite": bool(row["favorite"]),
"notes": row["notes"] or "", "notes": row["notes"] or "",
"usage_tips": row["usage_tips"] or "", "usage_tips": row["usage_tips"] or "",
"metadata_source": row["metadata_source"] or None,
"exclude": bool(row["exclude"]), "exclude": bool(row["exclude"]),
"db_checked": bool(row["db_checked"]), "db_checked": bool(row["db_checked"]),
"last_checked_at": row["last_checked_at"] or 0.0, "last_checked_at": row["last_checked_at"] or 0.0,
"tags": tags.get(file_path, []), "tags": tags.get(file_path, []),
"civitai": civitai, "civitai": civitai,
"civitai_deleted": bool(row["civitai_deleted"]),
}
raw_data.append(item)
@@ -159,45 +197,190 @@ class PersistentModelCache:
conn = self._connect()
try:
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("BEGIN")

model_rows = [self._prepare_model_row(model_type, item) for item in raw_data]
model_map: Dict[str, Tuple] = {
row[1]: row for row in model_rows if row[1]  # row[1] is file_path
}

existing_models = conn.execute(
"SELECT "
+ ", ".join(self._MODEL_COLUMNS[1:])
+ " FROM models WHERE model_type = ?",
(model_type,),
).fetchall()
existing_model_map: Dict[str, sqlite3.Row] = {
row["file_path"]: row for row in existing_models
}

to_remove_models = [
(model_type, path)
for path in existing_model_map.keys()
if path not in model_map
]
if to_remove_models:
conn.executemany(
"DELETE FROM models WHERE model_type = ? AND file_path = ?",
to_remove_models,
)
conn.executemany(
"DELETE FROM model_tags WHERE model_type = ? AND file_path = ?",
to_remove_models,
)
conn.executemany(
"DELETE FROM hash_index WHERE model_type = ? AND file_path = ?",
to_remove_models,
)
conn.executemany(
"DELETE FROM excluded_models WHERE model_type = ? AND file_path = ?",
to_remove_models,
)

insert_rows: List[Tuple] = []
update_rows: List[Tuple] = []
for file_path, row in model_map.items():
existing = existing_model_map.get(file_path)
if existing is None:
insert_rows.append(row)
continue
existing_values = tuple(
existing[column] for column in self._MODEL_COLUMNS[1:]
)
current_values = row[1:]
if existing_values != current_values:
update_rows.append(row[2:] + (model_type, file_path))

if insert_rows:
conn.executemany(self._insert_model_sql(), insert_rows)
if update_rows:
set_clause = ", ".join(
f"{column} = ?"
for column in self._MODEL_UPDATE_COLUMNS
)
update_sql = (
f"UPDATE models SET {set_clause} WHERE model_type = ? AND file_path = ?"
)
conn.executemany(update_sql, update_rows)

existing_tags_rows = conn.execute(
"SELECT file_path, tag FROM model_tags WHERE model_type = ?",
(model_type,),
).fetchall()
existing_tags: Dict[str, set] = {}
for row in existing_tags_rows:
existing_tags.setdefault(row["file_path"], set()).add(row["tag"])

new_tags: Dict[str, set] = {}
for item in raw_data:
file_path = item.get("file_path")
if not file_path:
continue
tags = set(item.get("tags") or [])
if tags:
new_tags[file_path] = tags

tag_inserts: List[Tuple[str, str, str]] = []
tag_deletes: List[Tuple[str, str, str]] = []
all_tag_paths = set(existing_tags.keys()) | set(new_tags.keys())
for path in all_tag_paths:
existing_set = existing_tags.get(path, set())
new_set = new_tags.get(path, set())
to_add = new_set - existing_set
to_remove = existing_set - new_set
for tag in to_add:
tag_inserts.append((model_type, path, tag))
for tag in to_remove:
tag_deletes.append((model_type, path, tag))

if tag_deletes:
conn.executemany(
"DELETE FROM model_tags WHERE model_type = ? AND file_path = ? AND tag = ?",
tag_deletes,
)
if tag_inserts:
conn.executemany(
"INSERT INTO model_tags (model_type, file_path, tag) VALUES (?, ?, ?)",
tag_inserts,
)

existing_hash_rows = conn.execute(
"SELECT sha256, file_path FROM hash_index WHERE model_type = ?",
(model_type,),
).fetchall()
existing_hash_map: Dict[str, set] = {}
for row in existing_hash_rows:
sha_value = (row["sha256"] or "").lower()
if not sha_value:
continue
existing_hash_map.setdefault(sha_value, set()).add(row["file_path"])

new_hash_map: Dict[str, set] = {}
for sha_value, paths in hash_index.items():
normalized_sha = (sha_value or "").lower()
if not normalized_sha:
continue
bucket = new_hash_map.setdefault(normalized_sha, set())
for path in paths:
if path:
bucket.add(path)

hash_inserts: List[Tuple[str, str, str]] = []
hash_deletes: List[Tuple[str, str, str]] = []
all_shas = set(existing_hash_map.keys()) | set(new_hash_map.keys())
for sha_value in all_shas:
existing_paths = existing_hash_map.get(sha_value, set())
new_paths = new_hash_map.get(sha_value, set())
for path in existing_paths - new_paths:
hash_deletes.append((model_type, sha_value, path))
for path in new_paths - existing_paths:
hash_inserts.append((model_type, sha_value, path))

if hash_deletes:
conn.executemany(
"DELETE FROM hash_index WHERE model_type = ? AND sha256 = ? AND file_path = ?",
hash_deletes,
)
if hash_inserts:
conn.executemany(
"INSERT OR IGNORE INTO hash_index (model_type, sha256, file_path) VALUES (?, ?, ?)",
hash_inserts,
)

existing_excluded_rows = conn.execute(
"SELECT file_path FROM excluded_models WHERE model_type = ?",
(model_type,),
).fetchall()
existing_excluded = {row["file_path"] for row in existing_excluded_rows}
new_excluded = {path for path in excluded_models if path}
excluded_deletes = [
(model_type, path)
for path in existing_excluded - new_excluded
]
excluded_inserts = [
(model_type, path)
for path in new_excluded - existing_excluded
]
if excluded_deletes:
conn.executemany(
"DELETE FROM excluded_models WHERE model_type = ? AND file_path = ?",
excluded_deletes,
)
if excluded_inserts:
conn.executemany(
"INSERT OR IGNORE INTO excluded_models (model_type, file_path) VALUES (?, ?)",
excluded_inserts,
)

conn.commit()
finally:
conn.close()
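The rewritten save path replaces the old delete-everything-and-reinsert strategy with a diff: compute set differences between what is already in the table and what should be there, then issue only the needed DELETEs and INSERTs. The core idea, reduced to plain sets (illustrative keys, not the cache's actual schema):

```python
def diff_rows(existing: set, desired: set):
    """Return (to_insert, to_delete) so applying both turns `existing` into `desired`."""
    return desired - existing, existing - desired

existing = {("lora", "a.safetensors"), ("lora", "b.safetensors")}
desired = {("lora", "b.safetensors"), ("lora", "c.safetensors")}
to_insert, to_delete = diff_rows(existing, desired)
print(sorted(to_insert))  # [('lora', 'c.safetensors')]
print(sorted(to_delete))  # [('lora', 'a.safetensors')]
```

Rows present in both sets are untouched, which is what keeps repeated saves cheap: unchanged models, tags, and hash entries generate zero SQL.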
@@ -248,10 +431,13 @@ class PersistentModelCache:
favorite INTEGER,
notes TEXT,
usage_tips TEXT,
metadata_source TEXT,
civitai_id INTEGER,
civitai_model_id INTEGER,
civitai_name TEXT,
civitai_creator_username TEXT,
trained_words TEXT,
civitai_deleted INTEGER,
exclude INTEGER,
db_checked INTEGER,
last_checked_at REAL,
@@ -279,11 +465,31 @@ class PersistentModelCache:
);
"""
)
self._ensure_additional_model_columns(conn)
conn.commit()
self._schema_initialized = True
except Exception as exc:  # pragma: no cover - defensive guard
logger.warning("Failed to initialize persistent cache schema: %s", exc)
def _ensure_additional_model_columns(self, conn: sqlite3.Connection) -> None:
try:
existing_columns = {
row["name"]
for row in conn.execute("PRAGMA table_info(models)").fetchall()
}
except Exception: # pragma: no cover - defensive guard
return
required_columns = {
"metadata_source": "TEXT",
"civitai_creator_username": "TEXT",
"civitai_deleted": "INTEGER DEFAULT 0",
}
for column, definition in required_columns.items():
if column not in existing_columns:
conn.execute(f"ALTER TABLE models ADD COLUMN {column} {definition}")
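`_ensure_additional_model_columns` is a lightweight in-place migration: it reads the live column list via `PRAGMA table_info` and issues `ALTER TABLE ... ADD COLUMN` for anything missing, so old databases pick up the new columns without a rebuild. A runnable sketch of the same idiom (table and columns here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute("CREATE TABLE models (file_path TEXT)")  # stand-in for an old schema

required = {"metadata_source": "TEXT", "civitai_deleted": "INTEGER DEFAULT 0"}
existing = {row["name"] for row in conn.execute("PRAGMA table_info(models)")}
for column, definition in required.items():
    if column not in existing:
        # ADD COLUMN is cheap in SQLite, and the guard makes this safe to re-run
        conn.execute(f"ALTER TABLE models ADD COLUMN {column} {definition}")

columns = {row["name"] for row in conn.execute("PRAGMA table_info(models)")}
print(sorted(columns))  # ['civitai_deleted', 'file_path', 'metadata_source']
```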
def _connect(self, readonly: bool = False) -> sqlite3.Connection:
uri = False
path = self._db_path
@@ -306,6 +512,12 @@ class PersistentModelCache:
else:
trained_words_json = json.dumps(trained_words)
metadata_source = item.get("metadata_source") or None
creator_username = None
creator_data = civitai.get("creator") if isinstance(civitai, dict) else None
if isinstance(creator_data, dict):
creator_username = creator_data.get("username") or None
return (
model_type,
item.get("file_path"),
@@ -322,22 +534,22 @@ class PersistentModelCache:
1 if item.get("favorite") else 0, 1 if item.get("favorite") else 0,
item.get("notes"), item.get("notes"),
item.get("usage_tips"), item.get("usage_tips"),
metadata_source,
civitai.get("id"), civitai.get("id"),
civitai.get("modelId"), civitai.get("modelId"),
civitai.get("name"), civitai.get("name"),
creator_username,
trained_words_json,
1 if item.get("civitai_deleted") else 0,
1 if item.get("exclude") else 0, 1 if item.get("exclude") else 0,
1 if item.get("db_checked") else 0, 1 if item.get("db_checked") else 0,
float(item.get("last_checked_at") or 0.0), float(item.get("last_checked_at") or 0.0),
) )
def _insert_model_sql(self) -> str:
columns = ", ".join(self._MODEL_COLUMNS)
placeholders = ", ".join(["?"] * len(self._MODEL_COLUMNS))
return f"INSERT INTO models ({columns}) VALUES ({placeholders})"
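Generating the INSERT statement from `_MODEL_COLUMNS` keeps the SQL and the row tuples in lockstep: adding a column to the tuple automatically adds a matching placeholder, replacing the hand-maintained 22-placeholder string the diff removes. The pattern in isolation (column names are illustrative):

```python
MODEL_COLUMNS = ("model_type", "file_path", "file_name")

def insert_model_sql() -> str:
    # One placeholder per column; the two lists can never drift apart
    columns = ", ".join(MODEL_COLUMNS)
    placeholders = ", ".join(["?"] * len(MODEL_COLUMNS))
    return f"INSERT INTO models ({columns}) VALUES ({placeholders})"

print(insert_model_sql())
# INSERT INTO models (model_type, file_path, file_name) VALUES (?, ?, ?)
```

Only the column names come from code, never from user input, so the f-string here does not reopen an injection risk; values still go through `?` placeholders.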
def _load_tags(self, conn: sqlite3.Connection, model_type: str) -> Dict[str, List[str]]:
tag_rows = conn.execute(

View File

@@ -61,7 +61,7 @@ class RecipeSharingService:
safe_title = recipe.get("title", "").replace(" ", "_").lower()
filename = f"recipe_{safe_title}{ext}" if safe_title else f"recipe_{recipe_id}{ext}"
url_path = f"/api/lm/recipe/{recipe_id}/share/download?t={timestamp}"
return SharingResult({"success": True, "download_url": url_path, "filename": filename})

async def prepare_download(self, *, recipe_scanner, recipe_id: str) -> DownloadInfo:

View File

@@ -145,6 +145,49 @@ class ServiceRegistry:
logger.debug(f"Created and registered {service_name}") logger.debug(f"Created and registered {service_name}")
return client return client
@classmethod
async def get_model_update_service(cls):
"""Get or create the model update tracking service."""
service_name = "model_update_service"
if service_name in cls._services:
return cls._services[service_name]
async with cls._get_lock(service_name):
if service_name in cls._services:
return cls._services[service_name]
from .model_update_service import ModelUpdateService
from .persistent_model_cache import get_persistent_cache
cache = get_persistent_cache()
service = ModelUpdateService(cache.get_database_path())
cls._services[service_name] = service
logger.debug(f"Created and registered {service_name}")
return service
@classmethod
async def get_civarchive_client(cls):
"""Get or create CivArchive client instance"""
service_name = "civarchive_client"
if service_name in cls._services:
return cls._services[service_name]
async with cls._get_lock(service_name):
# Double-check after acquiring lock
if service_name in cls._services:
return cls._services[service_name]
# Import here to avoid circular imports
from .civarchive_client import CivArchiveClient
client = await CivArchiveClient.get_instance()
cls._services[service_name] = client
logger.debug(f"Created and registered {service_name}")
return client
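Both new registry getters use the same double-checked pattern: a lock-free fast path for the common case, then a re-check under a per-service `asyncio.Lock` so that two concurrent callers cannot construct two instances. A minimal self-contained sketch of that shape (the registry class here is illustrative, not the project's `ServiceRegistry`):

```python
import asyncio

class Registry:
    _services: dict = {}
    _locks: dict = {}

    @classmethod
    def _get_lock(cls, name: str) -> asyncio.Lock:
        return cls._locks.setdefault(name, asyncio.Lock())

    @classmethod
    async def get_service(cls, name: str, factory):
        if name in cls._services:          # fast path, no lock taken
            return cls._services[name]
        async with cls._get_lock(name):
            if name in cls._services:      # double-check after acquiring the lock
                return cls._services[name]
            cls._services[name] = await factory()
            return cls._services[name]

async def main():
    calls = 0
    async def factory():
        nonlocal calls
        calls += 1
        await asyncio.sleep(0)             # yield so a race would be possible
        return object()
    results = await asyncio.gather(*(Registry.get_service("svc", factory) for _ in range(5)))
    assert len({id(r) for r in results}) == 1 and calls == 1  # single instance, single build

asyncio.run(main())
```

The double check matters because the factory awaits: while one caller is inside it, others may pass the fast path and queue on the lock.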
@classmethod
async def get_download_manager(cls):
"""Get or create Download manager instance"""

View File

@@ -4,9 +4,16 @@ import os
import logging
from datetime import datetime, timezone
from threading import Lock
from typing import Any, Dict, Iterable, List, Mapping, Optional, Sequence

from ..utils.constants import DEFAULT_PRIORITY_TAG_CONFIG
from ..utils.settings_paths import ensure_settings_file
from ..utils.tag_priorities import (
PriorityTagEntry,
collect_canonical_tags,
parse_priority_tag_string,
resolve_priority_tag,
)
logger = logging.getLogger(__name__)
@@ -36,6 +43,8 @@ DEFAULT_SETTINGS: Dict[str, Any] = {
"card_info_display": "always", "card_info_display": "always",
"include_trigger_words": False, "include_trigger_words": False,
"compact_mode": False, "compact_mode": False,
"priority_tags": DEFAULT_PRIORITY_TAG_CONFIG.copy(),
"model_name_display": "model_name",
}
@@ -63,6 +72,12 @@ class SettingsManager:
def _ensure_default_settings(self) -> None:
"""Ensure all default settings keys exist"""
updated = False
normalized_priority = self._normalize_priority_tag_config(
self.settings.get("priority_tags")
)
if normalized_priority != self.settings.get("priority_tags"):
self.settings["priority_tags"] = normalized_priority
updated = True
for key, value in self._get_default_settings().items():
if key not in self.settings:
if isinstance(value, dict):
@@ -385,8 +400,56 @@ class SettingsManager:
# Ensure nested dicts are independent copies
defaults['base_model_path_mappings'] = {}
defaults['download_path_templates'] = {}
defaults['priority_tags'] = DEFAULT_PRIORITY_TAG_CONFIG.copy()
return defaults
def _normalize_priority_tag_config(self, value: Any) -> Dict[str, str]:
normalized: Dict[str, str] = {}
if isinstance(value, Mapping):
for key, raw in value.items():
if not isinstance(key, str) or not isinstance(raw, str):
continue
normalized[key] = raw.strip()
for model_type, default_value in DEFAULT_PRIORITY_TAG_CONFIG.items():
normalized.setdefault(model_type, default_value)
return normalized
def get_priority_tag_config(self) -> Dict[str, str]:
stored_value = self.settings.get("priority_tags")
normalized = self._normalize_priority_tag_config(stored_value)
if normalized != stored_value:
self.settings["priority_tags"] = normalized
self._save_settings()
return normalized.copy()
def get_priority_tag_entries(self, model_type: str) -> List[PriorityTagEntry]:
config = self.get_priority_tag_config()
raw_config = config.get(model_type, "")
return parse_priority_tag_string(raw_config)
def resolve_priority_tag_for_model(
self, tags: Sequence[str] | Iterable[str], model_type: str
) -> str:
entries = self.get_priority_tag_entries(model_type)
resolved = resolve_priority_tag(tags, entries)
if resolved:
return resolved
for tag in tags:
if isinstance(tag, str) and tag:
return tag
return ""
def get_priority_tag_suggestions(self) -> Dict[str, List[str]]:
suggestions: Dict[str, List[str]] = {}
config = self.get_priority_tag_config()
for model_type, raw_value in config.items():
entries = parse_priority_tag_string(raw_value)
suggestions[model_type] = collect_canonical_tags(entries)
return suggestions
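The priority-tag helpers (defined in the new `py/utils/tag_priorities.py`, only partially visible at the end of this diff) pick the first configured tag that a model actually carries, with `resolve_priority_tag_for_model` falling back to the model's first non-empty tag. Since the real parser isn't fully shown, here is a hedged sketch of the resolution idea only; the actual `parse_priority_tag_string`/`resolve_priority_tag` may handle aliases and casing differently:

```python
def resolve_priority_tag(tags, priority_order):
    """Return the first priority tag (in configured order) present on the model, else ''."""
    tag_set = {t.lower() for t in tags if isinstance(t, str)}
    for candidate in priority_order:
        if candidate.lower() in tag_set:
            return candidate
    return ""

# A config string like the DEFAULT_PRIORITY_TAG_CONFIG values, split and trimmed
priority = [t.strip() for t in "character, style, concept".split(",")]
print(resolve_priority_tag(["anime", "Style", "clothing"], priority))  # 'style'
print(resolve_priority_tag(["landscape"], priority))                   # ''
```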
def get(self, key: str, default: Any = None) -> Any:
"""Get setting value"""
return self.settings.get(key, default)

View File

@@ -155,12 +155,22 @@ class WebSocketManager:
async def broadcast_download_progress(self, download_id: str, data: Dict):
"""Send progress update to specific download client"""
progress_entry = {
'progress': data.get('progress', 0),
'timestamp': datetime.now(),
}
for field in ('bytes_downloaded', 'total_bytes', 'bytes_per_second'):
if field in data:
progress_entry[field] = data[field]
if 'status' in data:
progress_entry['status'] = data['status']
if 'message' in data:
progress_entry['message'] = data['message']
self._download_progress[download_id] = progress_entry
if download_id not in self._download_websockets:
logger.debug(f"No WebSocket found for download ID: {download_id}")
return

View File

@@ -40,7 +40,6 @@ def rewrite_preview_url(source_url: str | None, media_type: str | None = None) -
return source_url, False

rewritten = urlunparse(parsed._replace(path=updated_path))
return rewritten, True

View File

@@ -65,3 +65,10 @@ CIVITAI_MODEL_TAGS = [
'poses', 'background', 'tool', 'vehicle', 'buildings',
'objects', 'assets', 'animal', 'action'
]
# Default priority tag configuration strings for each model type
DEFAULT_PRIORITY_TAG_CONFIG = {
'lora': ', '.join(CIVITAI_MODEL_TAGS),
'checkpoint': ', '.join(CIVITAI_MODEL_TAGS),
'embedding': ', '.join(CIVITAI_MODEL_TAGS),
}

View File

@@ -105,6 +105,7 @@ class DownloadManager:
self._progress = _DownloadProgress()
self._ws_manager = ws_manager
self._state_lock = state_lock or asyncio.Lock()
self._stop_requested = False
def _resolve_output_dir(self, library_name: str | None = None) -> str:
base_path = get_settings_manager().get('example_images_path')
@@ -145,6 +146,7 @@ class DownloadManager:
raise DownloadConfigurationError('Example images path not configured in settings')
self._progress.reset()
self._stop_requested = False
self._progress['status'] = 'running'
self._progress['start_time'] = time.time()
self._progress['end_time'] = None
@@ -268,6 +270,27 @@ class DownloadManager:
'message': 'Download resumed'
}
async def stop_download(self, request):
"""Stop the example images download after the current model completes."""
async with self._state_lock:
if not self._is_downloading:
raise DownloadNotRunningError()
if self._progress['status'] in {'completed', 'error', 'stopped'}:
raise DownloadNotRunningError()
if self._progress['status'] != 'stopping':
self._stop_requested = True
self._progress['status'] = 'stopping'
await self._broadcast_progress(status='stopping')
return {
'success': True,
'message': 'Download stopping'
}
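`stop_download` only flips a flag and a status; the worker loop notices `'stopping'` at its next checkpoint and exits after the current item, which is why the method can return immediately with "Download stopping" rather than "stopped". A compact sketch of this cooperative-cancellation shape (class and names are illustrative):

```python
import asyncio

class Worker:
    def __init__(self):
        self.status = "running"
        self.stop_requested = False
        self.completed = 0

    async def stop(self):
        if self.status not in {"running", "paused"}:
            raise RuntimeError("not running")
        self.stop_requested = True
        self.status = "stopping"   # the loop, not this call, does the actual stopping

    async def run(self, items):
        for _ in items:
            if self.status not in {"running", "paused", "stopping"}:
                break
            await asyncio.sleep(0)              # stand-in for per-item work
            self.completed += 1
            if self.stop_requested and self.status == "stopping":
                break                           # finish the current item, then bail
        self.status = "stopped" if self.stop_requested else "completed"

async def main():
    w = Worker()
    task = asyncio.create_task(w.run(range(100)))
    await asyncio.sleep(0)   # let the loop start
    await w.stop()
    await task
    return w

w = asyncio.run(main())
print(w.status)  # 'stopped', with far fewer than 100 items processed
```

The key property is that no item is ever interrupted mid-download; stopping only takes effect at item boundaries.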
async def _download_all_example_images(
self,
output_dir,
@@ -311,6 +334,12 @@ class DownloadManager:
# Process each model
for i, (scanner_type, model, scanner) in enumerate(all_models):
async with self._state_lock:
current_status = self._progress['status']
if current_status not in {'running', 'paused', 'stopping'}:
break
# Main logic for processing model is here, but actual operations are delegated to other classes
was_remote_download = await self._process_model(
scanner_type,
@@ -324,21 +353,56 @@ class DownloadManager:
# Update progress
self._progress['completed'] += 1

async with self._state_lock:
current_status = self._progress['status']
should_stop = self._stop_requested and current_status == 'stopping'
broadcast_status = 'running' if current_status == 'running' else current_status

await self._broadcast_progress(status=broadcast_status)

if should_stop:
break

# Only add delay after remote download of models, and not after processing the last model
if (
was_remote_download
and i < len(all_models) - 1
and current_status == 'running'
):
await asyncio.sleep(delay)

async with self._state_lock:
if self._stop_requested and self._progress['status'] == 'stopping':
self._progress['status'] = 'stopped'
self._progress['end_time'] = time.time()
self._stop_requested = False
final_status = 'stopped'
elif self._progress['status'] not in {'error', 'stopped'}:
self._progress['status'] = 'completed'
self._progress['end_time'] = time.time()
self._stop_requested = False
final_status = 'completed'
else:
final_status = self._progress['status']
self._stop_requested = False
if self._progress['end_time'] is None:
self._progress['end_time'] = time.time()

if final_status == 'completed':
logger.debug(
"Example images download completed: %s/%s models processed",
self._progress['completed'],
self._progress['total'],
)
elif final_status == 'stopped':
logger.debug(
"Example images download stopped: %s/%s models processed",
self._progress['completed'],
self._progress['total'],
)

await self._broadcast_progress(status=final_status)
except Exception as e:
error_msg = f"Error during example images download: {str(e)}"
@@ -360,6 +424,7 @@ class DownloadManager:
async with self._state_lock:
self._is_downloading = False
self._download_task = None
self._stop_requested = False
async def _process_model(
self,
@@ -378,7 +443,7 @@ class DownloadManager:
await asyncio.sleep(1)

# Check if download should continue
if self._progress['status'] not in {'running', 'stopping'}:
logger.info(f"Download stopped: {self._progress['status']}")
return False  # Return False to indicate no remote download happened
@@ -567,6 +632,7 @@ class DownloadManager:
raise DownloadConfigurationError('Example images path not configured in settings')
self._progress.reset()
self._stop_requested = False
self._progress['total'] = len(model_hashes)
self._progress['status'] = 'running'
self._progress['start_time'] = time.time()
@@ -588,10 +654,15 @@ class DownloadManager:
async with self._state_lock:
self._is_downloading = False

final_status = self._progress['status']
message = 'Force download completed'
if final_status == 'stopped':
message = 'Force download stopped'

return {
'success': True,
'message': message,
'result': result
}
@@ -649,6 +720,12 @@ class DownloadManager:
# Process each model
success_count = 0
for i, (scanner_type, model, scanner) in enumerate(models_to_process):
async with self._state_lock:
current_status = self._progress['status']
if current_status not in {'running', 'paused', 'stopping'}:
break
# Force process this model regardless of previous status
was_successful = await self._process_specific_model(
scanner_type,
@@ -666,24 +743,57 @@ class DownloadManager:
# Update progress
self._progress['completed'] += 1

async with self._state_lock:
current_status = self._progress['status']
should_stop = self._stop_requested and current_status == 'stopping'
broadcast_status = 'running' if current_status == 'running' else current_status

# Send progress update via WebSocket
await self._broadcast_progress(status=broadcast_status)

if should_stop:
break

# Only add delay after remote download, and not after processing the last model
if (
was_successful
and i < len(models_to_process) - 1
and current_status == 'running'
):
await asyncio.sleep(delay)

async with self._state_lock:
if self._stop_requested and self._progress['status'] == 'stopping':
self._progress['status'] = 'stopped'
self._progress['end_time'] = time.time()
self._stop_requested = False
final_status = 'stopped'
elif self._progress['status'] not in {'error', 'stopped'}:
self._progress['status'] = 'completed'
self._progress['end_time'] = time.time()
self._stop_requested = False
final_status = 'completed'
else:
final_status = self._progress['status']
self._stop_requested = False
if self._progress['end_time'] is None:
self._progress['end_time'] = time.time()

if final_status == 'completed':
logger.debug(
"Forced example images download completed: %s/%s models processed",
self._progress['completed'],
self._progress['total'],
)
elif final_status == 'stopped':
logger.debug(
"Forced example images download stopped: %s/%s models processed",
self._progress['completed'],
self._progress['total'],
)

# Send final progress via WebSocket
await self._broadcast_progress(status=final_status)
return {
'total': self._progress['total'],
@@ -726,7 +836,7 @@ class DownloadManager:
await asyncio.sleep(1)

# Check if download should continue
if self._progress['status'] not in {'running', 'stopping'}:
logger.info(f"Download stopped: {self._progress['status']}")
return False

View File

@@ -270,7 +270,12 @@ class MetadataUpdater:
""" """
try: try:
await MetadataManager.hydrate_model_data(model_data) await MetadataManager.hydrate_model_data(model_data)
civitai_data = model_data.setdefault('civitai', {}) civitai_data = model_data.get('civitai')
if not isinstance(civitai_data, dict):
civitai_data = {}
model_data['civitai'] = civitai_data
custom_images = civitai_data.get('customImages')
if not isinstance(custom_images, list):
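The change from `setdefault('civitai', {})` to an explicit `isinstance` guard matters when the key exists but holds a non-dict value such as `None`: `setdefault` returns the existing value unchanged, so the subsequent `.get(...)` would raise `AttributeError`. Demonstrating the pitfall:

```python
model_data = {"civitai": None}  # key present, but not a dict

# setdefault does NOT replace an existing None value
assert model_data.setdefault("civitai", {}) is None

# the guarded form always leaves us holding a dict
civitai_data = model_data.get("civitai")
if not isinstance(civitai_data, dict):
    civitai_data = {}
    model_data["civitai"] = civitai_data

print(civitai_data.get("customImages"))  # None, instead of an AttributeError crash
```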

View File

@@ -18,18 +18,22 @@ class BaseModelMetadata:
preview_nsfw_level: int = 0  # NSFW level of the preview image
notes: str = ""  # Additional notes
from_civitai: bool = True  # Whether from Civitai
civitai: Dict[str, Any] = field(default_factory=dict)  # Civitai API data if available
tags: List[str] = None  # Model tags
modelDescription: str = ""  # Full model description
civitai_deleted: bool = False  # Whether deleted from Civitai
favorite: bool = False  # Whether the model is a favorite
exclude: bool = False  # Whether to exclude this model from the cache
db_checked: bool = False  # Whether checked in archive DB
metadata_source: Optional[str] = None # Last provider that supplied metadata
last_checked_at: float = 0 # Last checked timestamp last_checked_at: float = 0 # Last checked timestamp
_unknown_fields: Dict[str, Any] = field(default_factory=dict, repr=False, compare=False) # Store unknown fields _unknown_fields: Dict[str, Any] = field(default_factory=dict, repr=False, compare=False) # Store unknown fields
def __post_init__(self): def __post_init__(self):
# Initialize empty lists to avoid mutable default parameter issue # Initialize empty lists to avoid mutable default parameter issue
if self.civitai is None:
self.civitai = {}
if self.tags is None: if self.tags is None:
self.tags = [] self.tags = []
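Moving `civitai` from `Optional[Dict] = None` to `field(default_factory=dict)` gives every instance its own fresh dict and removes the `__post_init__` None check. A minimal sketch with a simplified field set, not the real class:

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class Meta:
    # default_factory builds a fresh dict per instance; a plain `= {}`
    # default would be rejected by dataclasses as a mutable default,
    # and `= None` would force a None check before every use.
    civitai: Dict[str, Any] = field(default_factory=dict)

a, b = Meta(), Meta()
a.civitai['id'] = 1
assert b.civitai == {}  # instances do not share state
```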

py/utils/tag_priorities.py Normal file

@@ -0,0 +1,104 @@
"""Helpers for parsing and resolving priority tag configurations."""
from __future__ import annotations
from dataclasses import dataclass
from typing import Dict, Iterable, List, Optional, Sequence, Set
@dataclass(frozen=True)
class PriorityTagEntry:
"""A parsed priority tag configuration entry."""
canonical: str
aliases: Set[str]
@property
def normalized_aliases(self) -> Set[str]:
return {alias.lower() for alias in self.aliases}
def _normalize_alias(alias: str) -> str:
return alias.strip()
def parse_priority_tag_string(config: str | None) -> List[PriorityTagEntry]:
"""Parse the user-facing priority tag string into structured entries."""
if not config:
return []
entries: List[PriorityTagEntry] = []
seen_canonicals: Set[str] = set()
for raw_entry in _split_priority_entries(config):
canonical, aliases = _parse_priority_entry(raw_entry)
if not canonical:
continue
normalized_canonical = canonical.lower()
if normalized_canonical in seen_canonicals:
# Skip duplicate canonicals while preserving first occurrence priority
continue
seen_canonicals.add(normalized_canonical)
alias_set = {canonical, *aliases}
cleaned_aliases = {_normalize_alias(alias) for alias in alias_set if _normalize_alias(alias)}
if not cleaned_aliases:
continue
entries.append(PriorityTagEntry(canonical=canonical, aliases=cleaned_aliases))
return entries
def _split_priority_entries(config: str) -> List[str]:
# Split on commas while respecting that users may add new lines for readability
parts = []
for chunk in config.split('\n'):
parts.extend(chunk.split(','))
return [part.strip() for part in parts if part.strip()]
def _parse_priority_entry(entry: str) -> tuple[str, Set[str]]:
if '(' in entry and entry.endswith(')'):
canonical, raw_aliases = entry.split('(', 1)
canonical = canonical.strip()
alias_section = raw_aliases[:-1] # drop trailing ')'
aliases = {alias.strip() for alias in alias_section.split('|') if alias.strip()}
return canonical, aliases
if '(' in entry and not entry.endswith(')'):
# Malformed entry; treat as literal canonical to avoid surprises
entry = entry.replace('(', '').replace(')', '')
canonical = entry.strip()
return canonical, set()
def resolve_priority_tag(
tags: Sequence[str] | Iterable[str],
entries: Sequence[PriorityTagEntry],
) -> Optional[str]:
"""Resolve the first matching canonical priority tag for the provided tags."""
tag_lookup: Dict[str, str] = {}
for tag in tags:
if not isinstance(tag, str):
continue
normalized = tag.lower()
if normalized not in tag_lookup:
tag_lookup[normalized] = tag
for entry in entries:
for alias in entry.normalized_aliases:
if alias in tag_lookup:
return entry.canonical
return None
def collect_canonical_tags(entries: Iterable[PriorityTagEntry]) -> List[str]:
"""Return the ordered list of canonical tags from the parsed entries."""
return [entry.canonical for entry in entries]
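The config syntax the module above accepts can be summarized as comma- or newline-separated entries, where `canonical (alias1|alias2)` maps aliases onto a canonical tag and matching is case-insensitive in entry order. A condensed re-sketch of those rules (not the module itself, which also deduplicates canonicals and normalizes whitespace more carefully):

```python
def parse(config: str):
    # Split on commas and newlines, then pull out "canonical (a|b)" entries.
    parts = []
    for chunk in config.split('\n'):
        parts.extend(p.strip() for p in chunk.split(','))
    entries = []
    for entry in (p for p in parts if p):
        if '(' in entry and entry.endswith(')'):
            canonical, raw = entry.split('(', 1)
            aliases = {a.strip().lower() for a in raw[:-1].split('|') if a.strip()}
            entries.append((canonical.strip(), aliases))
        else:
            entries.append((entry.replace('(', '').replace(')', '').strip(), set()))
    return entries

def resolve(tags, entries):
    # First entry whose canonical or alias appears among the tags wins.
    lookup = {t.lower() for t in tags}
    for canonical, aliases in entries:
        if canonical.lower() in lookup or aliases & lookup:
            return canonical
    return None

entries = parse("style (styles|artist style), concept")
assert resolve(["Artist Style", "concept"], entries) == "style"
```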


@@ -4,7 +4,6 @@ from typing import Dict
from ..services.service_registry import ServiceRegistry
from ..config import config
from ..services.settings_manager import get_settings_manager
-from .constants import CIVITAI_MODEL_TAGS
import asyncio
def get_lora_info(lora_name):
@@ -170,16 +169,7 @@ def calculate_relative_path_for_model(model_data: Dict, model_type: str = 'lora'
base_model_mappings = settings_manager.get('base_model_path_mappings', {})
mapped_base_model = base_model_mappings.get(base_model, base_model)
-# Find the first Civitai model tag that exists in model_tags
-first_tag = ''
-for civitai_tag in CIVITAI_MODEL_TAGS:
-if civitai_tag in model_tags:
-first_tag = civitai_tag
-break
-# If no Civitai model tag found, fallback to first tag
-if not first_tag and model_tags:
-first_tag = model_tags[0]
+first_tag = settings_manager.resolve_priority_tag_for_model(model_tags, model_type)
if not first_tag:
first_tag = 'no tags' # Default if no tags available


@@ -1,7 +1,7 @@
[project]
name = "comfyui-lora-manager"
description = "Revolutionize your workflow with the ultimate LoRA companion for ComfyUI!"
-version = "0.9.7"
+version = "0.9.8"
license = {file = "LICENSE"}
dependencies = [
"aiohttp",


@@ -0,0 +1,134 @@
{
"id": 1746460,
"name": "Mixplin Style [Illustrious]",
"type": "LORA",
"description": "description",
"username": "Ty_Lee",
"downloadCount": 4207,
"favoriteCount": 0,
"commentCount": 8,
"ratingCount": 0,
"rating": 0,
"is_nsfw": true,
"nsfw_level": 31,
"createdAt": "2025-07-06T01:51:42.859Z",
"updatedAt": "2025-10-10T23:15:26.714Z",
"deletedAt": null,
"tags": [
"art",
"style",
"artist style",
"styles",
"mixplin",
"artiststyle"
],
"creator_id": "Ty_Lee",
"creator_username": "Ty_Lee",
"creator_name": "Ty_Lee",
"creator_url": "/users/Ty_Lee",
"versions": [
{
"id": 2042594,
"name": "v2.0",
"href": "/models/1746460?modelVersionId=2042594"
},
{
"id": 1976567,
"name": "v1.0",
"href": "/models/1746460?modelVersionId=1976567"
}
],
"version": {
"id": 1976567,
"modelId": 1746460,
"name": "v1.0",
"baseModel": "Illustrious",
"baseModelType": "Standard",
"description": null,
"downloadCount": 437,
"ratingCount": 0,
"rating": 0,
"is_nsfw": true,
"nsfw_level": 31,
"createdAt": "2025-07-05T10:17:28.716Z",
"updatedAt": "2025-10-10T23:15:26.756Z",
"deletedAt": null,
"files": [
{
"id": 1874043,
"name": "mxpln-illustrious-ty_lee.safetensors",
"type": "Model",
"sizeKB": 223124.37109375,
"downloadUrl": "https://civitai.com/api/download/models/1976567",
"modelId": 1746460,
"modelName": "Mixplin Style [Illustrious]",
"modelVersionId": 1976567,
"is_nsfw": true,
"nsfw_level": 31,
"sha256": "e2b7a280d6539556f23f380b3f71e4e22bc4524445c4c96526e117c6005c6ad3",
"createdAt": "2025-07-05T10:17:28.716Z",
"updatedAt": "2025-10-10T23:15:26.766Z",
"is_primary": false,
"mirrors": [
{
"filename": "mxpln-illustrious-ty_lee.safetensors",
"url": "https://civitai.com/api/download/models/1976567",
"source": "civitai",
"model_id": 1746460,
"model_version_id": 1976567,
"deletedAt": null,
"is_gated": false,
"is_paid": false
}
]
}
],
"images": [
{
"id": 86403595,
"url": "https://img.genur.art/sig/width:450/quality:85/aHR0cHM6Ly9jLmdlbnVyLmFydC9hNmE3Njc2YS0wMWQ3LTQ1YzAtOWEzYS1mNWJiYTU4MDNiMDE=",
"nsfwLevel": 1,
"width": 1560,
"height": 2280,
"hash": "U7G8Zp0w02%IA6%N00-;D]-W~VNG0nMw-.IV",
"type": "image",
"minor": false,
"poi": false,
"hasMeta": true,
"hasPositivePrompt": true,
"onSite": false,
"remixOfId": null,
"image_url": "https://img.genur.art/sig/width:450/quality:85/aHR0cHM6Ly9jLmdlbnVyLmFydC9hNmE3Njc2YS0wMWQ3LTQ1YzAtOWEzYS1mNWJiYTU4MDNiMDE=",
"link": "https://genur.art/posts/86403595"
}
],
"trigger": [
"mxpln"
],
"allow_download": true,
"download_url": "/api/download/models/1976567",
"platform_url": "https://civitai.com/models/1746460?modelVersionId=1976567",
"civitai_model_id": 1746460,
"civitai_model_version_id": 1976567,
"href": "/models/1746460?modelVersionId=1976567",
"mirrors": [
{
"platform": "tensorart",
"href": "/tensorart/models/904473536033245448/versions/904473536033245448",
"platform_url": "https://tensor.art/models/904473536033245448",
"name": "Mixplin Style MXP",
"version_name": "Mixplin",
"id": "904473536033245448",
"version_id": "904473536033245448"
}
]
},
"platform": "civitai",
"platform_name": "CivitAI",
"meta": {
"title": "Mixplin Style [Illustrious] - v1.0 - CivitAI Archive",
"description": "Mixplin Style [Illustrious] v1.0 is a Illustrious LORA AI model created by Ty_Lee for generating images of art, style, artist style, styles, mixplin, artiststyle",
"image": "https://img.genur.art/sig/width:450/quality:85/aHR0cHM6Ly9jLmdlbnVyLmFydC9hNmE3Njc2YS0wMWQ3LTQ1YzAtOWEzYS1mNWJiYTU4MDNiMDE=",
"canonical": "https://civarchive.com/models/1746460?modelVersionId=1976567"
}
}

refs/target_version.json Normal file

@@ -0,0 +1,38 @@
{
"id": 2269146,
"modelId": 2004760,
"name": "v1.0 Illustrious",
"nsfwLevel": 1,
"trainedWords": ["PencilSketchDaal"],
"baseModel": "Illustrious",
"description": "<p>Illustrious. Your pencil may vary with your checkpoint. </p>",
"model": {
"name": "Pencil Sketch Anime",
"type": "LORA",
"nsfw": false,
"description": "description",
"tags": ["style"]
},
"files": [
{
"id": 2161260,
"sizeKB": 223106.37890625,
"name": "Pencil-Sketch-Illustrious.safetensors",
"type": "Model",
"hashes": {
"SHA256": "2C70479CD673B0FE056EAF4FD97C7F33A39F14853805431AC9AB84226ECE3B82"
},
"primary": true,
"downloadUrl": "https://civitai.com/api/download/models/2269146",
"mirrors": {}
}
],
"images": [
{},
{}
],
"creator": {
"username": "Daalis",
"image": "https://image.civitai.com/xG1nkqKTMzGDvpLrqFT7WA/eb245b49-edc8-4ed6-ad7b-6d61eb8c51de/width=96/Daalis.jpeg"
}
}


@@ -103,6 +103,23 @@
opacity: 0.7;
}
.download-transfer-stats {
margin-top: var(--space-2);
font-size: 0.85rem;
color: var(--text-color);
display: flex;
flex-direction: column;
gap: var(--space-1);
}
.download-transfer-stats .download-transfer-bytes,
.download-transfer-stats .download-transfer-speed {
display: flex;
align-items: center;
gap: var(--space-1);
white-space: nowrap;
}
@keyframes spin {
0% { transform: rotate(0deg); }
100% { transform: rotate(360deg); }


@@ -35,34 +35,38 @@
margin: 0;
}
-.settings-open-location-button {
+.settings-action-link {
display: inline-flex;
align-items: center;
justify-content: center;
width: 28px;
height: 28px;
-border: none;
+border-radius: var(--border-radius-xs);
+border: 1px solid transparent;
background: none;
color: var(--text-color);
opacity: 0.6;
cursor: pointer;
-border-radius: var(--border-radius-xs);
-transition: opacity 0.2s ease, background-color 0.2s ease;
+text-decoration: none;
+line-height: 1;
+transition: opacity 0.2s ease, background-color 0.2s ease, color 0.2s ease, border-color 0.2s ease;
}
-.settings-open-location-button:hover,
-.settings-open-location-button:focus-visible {
+.settings-action-link:hover,
+.settings-action-link:focus-visible {
opacity: 1;
+color: var(--lora-accent);
background-color: rgba(var(--border-color-rgb, 148, 163, 184), 0.2);
+border-color: rgba(var(--border-color-rgb, 148, 163, 184), 0.4);
outline: none;
}
-.settings-open-location-button i {
+.settings-action-link i {
font-size: 1em;
}
-.settings-open-location-button:focus-visible {
-box-shadow: 0 0 0 2px rgba(var(--border-color-rgb, 148, 163, 184), 0.6);
+.settings-action-link:focus-visible {
+box-shadow: 0 0 0 2px rgba(var(--lora-accent-rgb, 79, 70, 229), 0.2);
}
/* Settings Links */
@@ -204,6 +208,141 @@
width: 100%; /* Full width */
}
.settings-help-text {
font-size: 0.9em;
color: var(--text-color);
opacity: 0.8;
margin-bottom: var(--space-2);
line-height: 1.4;
}
.settings-help-text.subtle {
font-size: 0.85em;
opacity: 0.7;
margin-top: var(--space-1);
}
.priority-tags-input {
width: 97%;
min-height: 72px;
padding: 8px;
border-radius: var(--border-radius-xs);
border: 1px solid var(--border-color);
background-color: var(--lora-surface);
color: var(--text-color);
resize: vertical;
}
.priority-tags-input:focus {
border-color: var(--lora-accent);
outline: none;
box-shadow: 0 0 0 2px rgba(var(--lora-accent-rgb, 79, 70, 229), 0.1);
}
.priority-tags-item {
gap: var(--space-2);
}
.priority-tags-header {
align-items: center;
}
.priority-tags-actions {
display: flex;
align-items: center;
justify-content: flex-end;
}
.priority-tags-example {
font-size: 0.85em;
opacity: 0.8;
margin-top: var(--space-1);
}
.priority-tags-example code {
font-family: var(--code-font, ui-monospace, SFMono-Regular, Menlo, Monaco, Consolas, 'Liberation Mono', 'Courier New', monospace);
background-color: rgba(var(--lora-accent-rgb, 79, 70, 229), 0.12);
padding: 2px 6px;
border-radius: var(--border-radius-xs);
display: inline-block;
}
.priority-tags-tabs {
position: relative;
}
.priority-tags-tab-input {
position: absolute;
opacity: 0;
pointer-events: none;
}
.priority-tags-tablist {
display: flex;
gap: var(--space-1);
border-bottom: 1px solid var(--border-color);
padding-bottom: var(--space-1);
}
.priority-tags-tab-label {
flex: 1;
text-align: center;
padding: var(--space-1) var(--space-2);
border: none;
border-bottom: 2px solid transparent;
color: var(--text-color);
cursor: pointer;
transition: all 0.2s ease;
opacity: 0.7;
}
.priority-tags-tab-label:hover,
.priority-tags-tab-label:focus {
opacity: 1;
color: var(--lora-accent);
background: oklch(var(--lora-accent-l) var(--lora-accent-c) var(--lora-accent-h) / 0.05);
}
.priority-tags-panels {
margin-top: var(--space-2);
}
.priority-tags-panel {
display: none;
}
#priority-tags-tab-lora:checked ~ .priority-tags-tablist label[for="priority-tags-tab-lora"],
#priority-tags-tab-checkpoint:checked ~ .priority-tags-tablist label[for="priority-tags-tab-checkpoint"],
#priority-tags-tab-embedding:checked ~ .priority-tags-tablist label[for="priority-tags-tab-embedding"] {
border-bottom-color: var(--lora-accent);
color: var(--lora-accent);
opacity: 1;
font-weight: 600;
}
#priority-tags-tab-lora:checked ~ .priority-tags-panels #priority-tags-panel-lora,
#priority-tags-tab-checkpoint:checked ~ .priority-tags-panels #priority-tags-panel-checkpoint,
#priority-tags-tab-embedding:checked ~ .priority-tags-panels #priority-tags-panel-embedding {
display: block;
}
.priority-tags-input.settings-input-error {
border-color: var(--danger-color, #dc2626);
box-shadow: 0 0 0 2px rgba(220, 38, 38, 0.12);
}
.settings-input-error-message {
font-size: 0.8em;
color: var(--danger-color, #dc2626);
display: none;
}
.metadata-suggestions-loading {
font-size: 0.85em;
opacity: 0.7;
padding: 6px 0;
}
/* Settings Styles */
.settings-section {
margin-top: var(--space-3);


@@ -92,6 +92,39 @@
border-radius: var(--border-radius-xs);
padding: 4px 8px;
position: relative;
cursor: grab;
transition: transform 0.18s ease;
}
.metadata-item:active {
cursor: grabbing;
}
.metadata-item-dragging {
box-shadow: 0 10px 24px rgba(0, 0, 0, 0.25);
cursor: grabbing;
opacity: 0.95;
transition: none;
}
.metadata-item-placeholder {
border: 1px dashed var(--lora-accent);
border-radius: var(--border-radius-xs);
background: rgba(255, 255, 255, 0.1);
pointer-events: none;
}
.metadata-items-sorting .metadata-item {
transition: transform 0.18s ease;
}
body.metadata-drag-active {
user-select: none;
cursor: grabbing;
}
body.metadata-drag-active * {
cursor: grabbing !important;
}
.metadata-item-content {


@@ -184,6 +184,19 @@
font-weight: 500;
}
.sidebar-tree-node-content.drop-target,
.sidebar-node-content.drop-target {
background: oklch(var(--lora-accent-l) var(--lora-accent-c) var(--lora-accent-h) / 0.15);
color: var(--lora-accent);
border-left-color: var(--lora-accent);
}
.sidebar-tree-node-content.drop-target .sidebar-tree-folder-icon,
.sidebar-node-content.drop-target .sidebar-folder-icon {
color: var(--lora-accent);
opacity: 1;
}
.sidebar-tree-expand-icon {
width: 16px;
height: 16px;


@@ -569,9 +569,15 @@ export class BaseModelApiClient {
}
}
-async fetchCivitaiVersions(modelId) {
+async fetchCivitaiVersions(modelId, source = null) {
try {
-const response = await fetch(`${this.apiConfig.endpoints.civitaiVersions}/${modelId}`);
+let requestUrl = `${this.apiConfig.endpoints.civitaiVersions}/${modelId}`;
+if (source) {
+const params = new URLSearchParams({ source });
+requestUrl = `${requestUrl}?${params.toString()}`;
+}
+const response = await fetch(requestUrl);
if (!response.ok) {
const errorData = await response.json().catch(() => ({}));
if (errorData && errorData.error && errorData.error.includes('Model type mismatch')) {
@@ -639,7 +645,7 @@ export class BaseModelApiClient {
}
}
-async downloadModel(modelId, versionId, modelRoot, relativePath, useDefaultPaths = false, downloadId) {
+async downloadModel(modelId, versionId, modelRoot, relativePath, useDefaultPaths = false, downloadId, source = null) {
try {
const response = await fetch(DOWNLOAD_ENDPOINTS.download, {
method: 'POST',
@@ -650,7 +656,8 @@ export class BaseModelApiClient {
model_root: modelRoot,
relative_path: relativePath,
use_default_paths: useDefaultPaths,
-download_id: downloadId
+download_id: downloadId,
+...(source ? { source } : {})
})
});


@@ -1,17 +1,17 @@
import { LoraApiClient } from './loraApi.js';
import { CheckpointApiClient } from './checkpointApi.js';
import { EmbeddingApiClient } from './embeddingApi.js';
-import { MODEL_TYPES } from './apiConfig.js';
+import { MODEL_TYPES, isValidModelType } from './apiConfig.js';
import { state } from '../state/index.js';
export function createModelApiClient(modelType) {
switch (modelType) {
case MODEL_TYPES.LORA:
-return new LoraApiClient();
+return new LoraApiClient(MODEL_TYPES.LORA);
case MODEL_TYPES.CHECKPOINT:
-return new CheckpointApiClient();
+return new CheckpointApiClient(MODEL_TYPES.CHECKPOINT);
case MODEL_TYPES.EMBEDDING:
-return new EmbeddingApiClient();
+return new EmbeddingApiClient(MODEL_TYPES.EMBEDDING);
default:
throw new Error(`Unsupported model type: ${modelType}`);
}
@@ -20,7 +20,13 @@ export function createModelApiClient(modelType) {
let _singletonClients = new Map();
export function getModelApiClient(modelType = null) {
-const targetType = modelType || state.currentPageType;
+let targetType = modelType;
+if (!isValidModelType(targetType)) {
+targetType = isValidModelType(state.currentPageType)
+? state.currentPageType
+: MODEL_TYPES.LORA;
+}
if (!_singletonClients.has(targetType)) {
_singletonClients.set(targetType, createModelApiClient(targetType));


@@ -4,6 +4,9 @@
import { getStorageItem, setStorageItem } from '../utils/storageHelpers.js';
import { getModelApiClient } from '../api/modelApiFactory.js';
import { translate } from '../utils/i18nHelpers.js';
import { state } from '../state/index.js';
import { bulkManager } from '../managers/BulkManager.js';
import { showToast } from '../utils/uiHelpers.js';
export class SidebarManager {
constructor() {
@@ -22,6 +25,12 @@ export class SidebarManager {
this.displayMode = 'tree'; // 'tree' or 'list'
this.foldersList = [];
this.recursiveSearchEnabled = true;
this.draggedFilePaths = null;
this.draggedRootPath = null;
this.draggedFromBulk = false;
this.dragHandlersInitialized = false;
this.folderTreeElement = null;
this.currentDropTarget = null;
// Bind methods
this.handleTreeClick = this.handleTreeClick.bind(this);
@@ -38,6 +47,12 @@ export class SidebarManager {
this.handleDisplayModeToggle = this.handleDisplayModeToggle.bind(this);
this.handleFolderListClick = this.handleFolderListClick.bind(this);
this.handleRecursiveToggle = this.handleRecursiveToggle.bind(this);
this.handleCardDragStart = this.handleCardDragStart.bind(this);
this.handleCardDragEnd = this.handleCardDragEnd.bind(this);
this.handleFolderDragEnter = this.handleFolderDragEnter.bind(this);
this.handleFolderDragOver = this.handleFolderDragOver.bind(this);
this.handleFolderDragLeave = this.handleFolderDragLeave.bind(this);
this.handleFolderDrop = this.handleFolderDrop.bind(this);
}
async initialize(pageControls) {
@@ -54,6 +69,7 @@ export class SidebarManager {
this.setInitialSidebarState();
this.setupEventHandlers();
this.initializeDragAndDrop();
this.updateSidebarTitle();
this.restoreSidebarState();
await this.loadFolderTree();
@@ -81,6 +97,21 @@ export class SidebarManager {
// Clean up event handlers
this.removeEventHandlers();
this.clearAllDropHighlights();
if (this.dragHandlersInitialized) {
document.removeEventListener('dragstart', this.handleCardDragStart);
document.removeEventListener('dragend', this.handleCardDragEnd);
this.dragHandlersInitialized = false;
}
if (this.folderTreeElement) {
this.folderTreeElement.removeEventListener('dragenter', this.handleFolderDragEnter);
this.folderTreeElement.removeEventListener('dragover', this.handleFolderDragOver);
this.folderTreeElement.removeEventListener('dragleave', this.handleFolderDragLeave);
this.folderTreeElement.removeEventListener('drop', this.handleFolderDrop);
this.folderTreeElement = null;
}
this.resetDragState();
// Reset state
this.pageControls = null;
this.pageType = null;
@@ -154,6 +185,271 @@ export class SidebarManager {
}
}
initializeDragAndDrop() {
if (!this.dragHandlersInitialized) {
document.addEventListener('dragstart', this.handleCardDragStart);
document.addEventListener('dragend', this.handleCardDragEnd);
this.dragHandlersInitialized = true;
}
const folderTree = document.getElementById('sidebarFolderTree');
if (folderTree && this.folderTreeElement !== folderTree) {
if (this.folderTreeElement) {
this.folderTreeElement.removeEventListener('dragenter', this.handleFolderDragEnter);
this.folderTreeElement.removeEventListener('dragover', this.handleFolderDragOver);
this.folderTreeElement.removeEventListener('dragleave', this.handleFolderDragLeave);
this.folderTreeElement.removeEventListener('drop', this.handleFolderDrop);
}
folderTree.addEventListener('dragenter', this.handleFolderDragEnter);
folderTree.addEventListener('dragover', this.handleFolderDragOver);
folderTree.addEventListener('dragleave', this.handleFolderDragLeave);
folderTree.addEventListener('drop', this.handleFolderDrop);
this.folderTreeElement = folderTree;
}
}
handleCardDragStart(event) {
const card = event.target.closest('.model-card');
if (!card) return;
const filePath = card.dataset.filepath;
if (!filePath) return;
const selectedSet = state.selectedModels instanceof Set
? state.selectedModels
: new Set(state.selectedModels || []);
const cardIsSelected = card.classList.contains('selected');
const usingBulkSelection = Boolean(state.bulkMode && cardIsSelected && selectedSet && selectedSet.size > 0);
const paths = usingBulkSelection ? Array.from(selectedSet) : [filePath];
const filePaths = Array.from(new Set(paths.filter(Boolean)));
if (filePaths.length === 0) {
return;
}
this.draggedFilePaths = filePaths;
this.draggedRootPath = this.getRootPathFromCard(card);
this.draggedFromBulk = usingBulkSelection;
const dataTransfer = event.dataTransfer;
if (dataTransfer) {
dataTransfer.effectAllowed = 'move';
dataTransfer.setData('text/plain', filePaths.join(','));
try {
dataTransfer.setData('application/json', JSON.stringify({ filePaths }));
} catch (error) {
// Ignore serialization errors
}
}
card.classList.add('dragging');
}
handleCardDragEnd(event) {
const card = event.target.closest('.model-card');
if (card) {
card.classList.remove('dragging');
}
this.clearAllDropHighlights();
this.resetDragState();
}
getRootPathFromCard(card) {
if (!card) return null;
const filePathRaw = card.dataset.filepath || '';
const normalizedFilePath = filePathRaw.replace(/\\/g, '/');
const lastSlashIndex = normalizedFilePath.lastIndexOf('/');
if (lastSlashIndex === -1) {
return null;
}
const directory = normalizedFilePath.substring(0, lastSlashIndex);
let folderValue = card.dataset.folder;
if (!folderValue || folderValue === 'undefined') {
folderValue = '';
}
const normalizedFolder = folderValue.replace(/\\/g, '/').replace(/^\/+|\/+$/g, '');
if (!normalizedFolder) {
return directory;
}
const suffix = `/${normalizedFolder}`;
if (directory.endsWith(suffix)) {
return directory.slice(0, -suffix.length);
}
return directory;
}
combineRootAndRelativePath(root, relative) {
const normalizedRoot = (root || '').replace(/\\/g, '/').replace(/\/+$/g, '');
const normalizedRelative = (relative || '').replace(/\\/g, '/').replace(/^\/+|\/+$/g, '');
if (!normalizedRoot) {
return normalizedRelative;
}
if (!normalizedRelative) {
return normalizedRoot;
}
return `${normalizedRoot}/${normalizedRelative}`;
}
getFolderElementFromEvent(event) {
const folderTree = this.folderTreeElement || document.getElementById('sidebarFolderTree');
if (!folderTree) return null;
const target = event.target instanceof Element ? event.target.closest('[data-path]') : null;
if (!target || !folderTree.contains(target)) {
return null;
}
return target;
}
setDropTargetHighlight(element, shouldAdd) {
if (!element) return;
let targetElement = element;
if (!targetElement.classList.contains('sidebar-tree-node-content') &&
!targetElement.classList.contains('sidebar-node-content')) {
targetElement = element.querySelector('.sidebar-tree-node-content, .sidebar-node-content');
}
if (targetElement) {
targetElement.classList.toggle('drop-target', shouldAdd);
}
}
handleFolderDragEnter(event) {
if (!this.draggedFilePaths || this.draggedFilePaths.length === 0) return;
const folderElement = this.getFolderElementFromEvent(event);
if (!folderElement) return;
event.preventDefault();
if (event.dataTransfer) {
event.dataTransfer.dropEffect = 'move';
}
this.setDropTargetHighlight(folderElement, true);
this.currentDropTarget = folderElement;
}
handleFolderDragOver(event) {
if (!this.draggedFilePaths || this.draggedFilePaths.length === 0) return;
const folderElement = this.getFolderElementFromEvent(event);
if (!folderElement) return;
event.preventDefault();
if (event.dataTransfer) {
event.dataTransfer.dropEffect = 'move';
}
}
handleFolderDragLeave(event) {
if (!this.draggedFilePaths || this.draggedFilePaths.length === 0) return;
const folderElement = this.getFolderElementFromEvent(event);
if (!folderElement) return;
const relatedTarget = event.relatedTarget instanceof Element ? event.relatedTarget : null;
if (!relatedTarget || !folderElement.contains(relatedTarget)) {
this.setDropTargetHighlight(folderElement, false);
if (this.currentDropTarget === folderElement) {
this.currentDropTarget = null;
}
}
}
async handleFolderDrop(event) {
if (!this.draggedFilePaths || this.draggedFilePaths.length === 0) return;
const folderElement = this.getFolderElementFromEvent(event);
if (!folderElement) return;
event.preventDefault();
event.stopPropagation();
this.setDropTargetHighlight(folderElement, false);
this.currentDropTarget = null;
const targetPath = folderElement.dataset.path || '';
await this.performDragMove(targetPath);
this.resetDragState();
this.clearAllDropHighlights();
}
async performDragMove(targetRelativePath) {
if (!this.draggedFilePaths || this.draggedFilePaths.length === 0) {
return false;
}
if (!this.apiClient) {
this.apiClient = getModelApiClient();
}
const rootPath = this.draggedRootPath ? this.draggedRootPath.replace(/\\/g, '/') : '';
if (!rootPath) {
showToast(
'toast.models.moveFailed',
{ message: translate('sidebar.dragDrop.unableToResolveRoot', {}, 'Unable to determine destination path for move.') },
'error'
);
return false;
}
const destination = this.combineRootAndRelativePath(rootPath, targetRelativePath);
const useBulkMove = this.draggedFromBulk || this.draggedFilePaths.length > 1;
try {
if (useBulkMove) {
await this.apiClient.moveBulkModels(this.draggedFilePaths, destination);
} else {
await this.apiClient.moveSingleModel(this.draggedFilePaths[0], destination);
}
if (this.pageControls && typeof this.pageControls.resetAndReload === 'function') {
await this.pageControls.resetAndReload(true);
} else {
await this.refresh();
}
if (this.draggedFromBulk && state.bulkMode && typeof bulkManager?.toggleBulkMode === 'function') {
bulkManager.toggleBulkMode();
}
return true;
} catch (error) {
console.error('Error moving model(s) via drag-and-drop:', error);
showToast('toast.models.moveFailed', { message: error.message || 'Unknown error' }, 'error');
return false;
}
}
resetDragState() {
this.draggedFilePaths = null;
this.draggedRootPath = null;
this.draggedFromBulk = false;
}
clearAllDropHighlights() {
const highlighted = document.querySelectorAll('.sidebar-tree-node-content.drop-target, .sidebar-node-content.drop-target');
highlighted.forEach((element) => element.classList.remove('drop-target'));
this.currentDropTarget = null;
}
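The `performDragMove` path above calls a `combineRootAndRelativePath` helper whose implementation is outside this hunk. A minimal sketch of what such a helper might look like, assuming forward-slash-normalized roots (as `performDragMove` already enforces with its `replace(/\\/g, '/')`); the name and behavior here are assumptions for illustration:

```javascript
// Hypothetical sketch of combineRootAndRelativePath; the real helper
// is defined elsewhere in SidebarManager. Joins a normalized root with
// an optional relative folder path without doubling slashes.
function combineRootAndRelativePath(rootPath, relativePath) {
  const root = rootPath.replace(/\/+$/, '');                 // strip trailing slashes
  const relative = (relativePath || '').replace(/^\/+/, ''); // strip leading slashes
  return relative ? `${root}/${relative}` : root;
}
```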
async init() {
this.apiClient = getModelApiClient();
@@ -161,6 +457,7 @@ export class SidebarManager {
this.setInitialSidebarState();
this.setupEventHandlers();
this.initializeDragAndDrop();
this.updateSidebarTitle();
this.restoreSidebarState();
await this.loadFolderTree();
@@ -464,6 +761,7 @@ export class SidebarManager {
} else {
this.renderFolderList();
}
this.initializeDragAndDrop();
}

renderTree() {
@@ -490,7 +788,7 @@
return `
<div class="sidebar-tree-node" data-path="${currentPath}">
<div class="sidebar-tree-node-content ${isSelected ? 'selected' : ''}" data-path="${currentPath}">
<div class="sidebar-tree-expand-icon ${isExpanded ? 'expanded' : ''}"
style="${hasChildren ? '' : 'opacity: 0; pointer-events: none;'}">
<i class="fas fa-chevron-right"></i>
@@ -535,7 +833,7 @@
return `
<div class="sidebar-folder-item ${isSelected ? 'selected' : ''}" data-path="${folder}">
<div class="sidebar-node-content" data-path="${folder}">
<i class="fas fa-folder sidebar-folder-icon"></i>
<div class="sidebar-folder-name" title="${displayName}">${displayName}</div>
</div>


@@ -11,6 +11,17 @@ import { showDeleteModal } from '../../utils/modalUtils.js';
import { translate } from '../../utils/i18nHelpers.js';
import { eventManager } from '../../utils/EventManager.js';
// Helper function to get display name based on settings
function getDisplayName(model) {
const displayNameSetting = state.global.settings.model_name_display || 'model_name';
if (displayNameSetting === 'file_name') {
return model.file_name || model.model_name || 'Unknown Model';
}
return model.model_name || model.file_name || 'Unknown Model';
}
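The fallback chain in `getDisplayName` can be illustrated with a pure variant that takes the setting as a parameter instead of reading global state (the parameterized form is an illustration, not project API):

```javascript
// Pure restatement of getDisplayName for illustration: same fallback
// chain, with the model_name_display setting passed in explicitly.
function resolveDisplayName(model, displayNameSetting = 'model_name') {
  if (displayNameSetting === 'file_name') {
    return model.file_name || model.model_name || 'Unknown Model';
  }
  return model.model_name || model.file_name || 'Unknown Model';
}
```

Either way, a model missing both fields degrades gracefully to the placeholder string.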
// Add global event delegation handlers using event manager
export function setupModelCardEventDelegation(modelType) {
// Remove any existing handler first
@@ -364,11 +375,12 @@ function showExampleAccessModal(card, modelType) {
export function createModelCard(model, modelType) {
const card = document.createElement('div');
card.className = 'model-card'; // Reuse the same class for styling
card.draggable = true;
card.dataset.sha256 = model.sha256;
card.dataset.filepath = model.file_path;
card.dataset.name = model.model_name;
card.dataset.file_name = model.file_name;
card.dataset.folder = model.folder || '';
card.dataset.modified = model.modified;
card.dataset.file_size = model.file_size;
card.dataset.from_civitai = model.from_civitai;
@@ -509,7 +521,7 @@ export function createModelCard(model, modelType) {
` : ''}
<div class="card-footer">
<div class="model-info">
<span class="model-name">${getDisplayName(model)}</span>
${model.civitai?.name ? `<span class="version-name">${model.civitai.name}</span>` : ''}
</div>
<div class="card-actions">


@@ -236,7 +236,7 @@ export async function showModelModal(model, modelType) {
setupShowcaseScroll(modalId);
setupTabSwitching();
setupTagTooltip();
setupTagEditMode(modelType);
setupModelNameEditing(modelWithFullData.file_path);
setupBaseModelEditing(modelWithFullData.file_path);
setupFileNameEditing(modelWithFullData.file_path);


@@ -4,7 +4,136 @@
*/
import { showToast } from '../../utils/uiHelpers.js';
import { getModelApiClient } from '../../api/modelApiFactory.js';
import { translate } from '../../utils/i18nHelpers.js';
import { getPriorityTagSuggestions } from '../../utils/priorityTagHelpers.js';
import { state } from '../../state/index.js';
const MODEL_TYPE_SUGGESTION_KEY_MAP = {
loras: 'lora',
lora: 'lora',
checkpoints: 'checkpoint',
checkpoint: 'checkpoint',
embeddings: 'embedding',
embedding: 'embedding',
};
const METADATA_ITEM_SELECTOR = '.metadata-item';
const METADATA_ITEMS_CONTAINER_SELECTOR = '.metadata-items';
const METADATA_ITEM_DRAGGING_CLASS = 'metadata-item-dragging';
const METADATA_ITEM_PLACEHOLDER_CLASS = 'metadata-item-placeholder';
const METADATA_ITEMS_SORTING_CLASS = 'metadata-items-sorting';
const BODY_DRAGGING_CLASS = 'metadata-drag-active';
let activeModelTypeKey = '';
let priorityTagSuggestions = [];
let priorityTagSuggestionsLoaded = false;
let priorityTagSuggestionsPromise = null;
let activeTagDragState = null;
function normalizeModelTypeKey(modelType) {
if (!modelType) {
return '';
}
const lower = String(modelType).toLowerCase();
if (MODEL_TYPE_SUGGESTION_KEY_MAP[lower]) {
return MODEL_TYPE_SUGGESTION_KEY_MAP[lower];
}
if (lower.endsWith('s')) {
return lower.slice(0, -1);
}
return lower;
}
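The normalization above can be exercised in isolation; this pure restatement (a sketch mirroring `MODEL_TYPE_SUGGESTION_KEY_MAP` and `normalizeModelTypeKey`, with illustrative names) shows the explicit-map-first, naive-singularization-fallback behavior:

```javascript
// Pure restatement of normalizeModelTypeKey for illustration:
// explicit map lookup first, then strip a trailing 's' as a fallback.
const SUGGESTION_KEY_MAP = {
  loras: 'lora', lora: 'lora',
  checkpoints: 'checkpoint', checkpoint: 'checkpoint',
  embeddings: 'embedding', embedding: 'embedding',
};

function toSuggestionKey(modelType) {
  if (!modelType) return '';
  const lower = String(modelType).toLowerCase();
  if (SUGGESTION_KEY_MAP[lower]) return SUGGESTION_KEY_MAP[lower];
  return lower.endsWith('s') ? lower.slice(0, -1) : lower;
}
```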
function resolveModelTypeKey(modelType = null) {
if (modelType) {
return normalizeModelTypeKey(modelType);
}
if (activeModelTypeKey) {
return activeModelTypeKey;
}
if (state?.currentPageType) {
return normalizeModelTypeKey(state.currentPageType);
}
return '';
}
function resetSuggestionState() {
priorityTagSuggestions = [];
priorityTagSuggestionsLoaded = false;
priorityTagSuggestionsPromise = null;
}
function setActiveModelTypeKey(modelType = null) {
const resolvedKey = resolveModelTypeKey(modelType);
if (resolvedKey === activeModelTypeKey) {
return activeModelTypeKey;
}
activeModelTypeKey = resolvedKey;
resetSuggestionState();
return activeModelTypeKey;
}
function ensurePriorityTagSuggestions(modelType = null) {
if (modelType !== null && modelType !== undefined) {
setActiveModelTypeKey(modelType);
} else if (!activeModelTypeKey) {
setActiveModelTypeKey();
}
if (!activeModelTypeKey) {
resetSuggestionState();
priorityTagSuggestionsLoaded = true;
return Promise.resolve([]);
}
if (priorityTagSuggestionsLoaded && !priorityTagSuggestionsPromise) {
return Promise.resolve(priorityTagSuggestions);
}
if (!priorityTagSuggestionsPromise) {
const requestKey = activeModelTypeKey;
priorityTagSuggestionsPromise = getPriorityTagSuggestions(requestKey)
.then((tags) => {
if (activeModelTypeKey === requestKey) {
priorityTagSuggestions = tags;
priorityTagSuggestionsLoaded = true;
}
return tags;
})
.catch(() => {
if (activeModelTypeKey === requestKey) {
priorityTagSuggestions = [];
priorityTagSuggestionsLoaded = true;
}
return [];
})
.finally(() => {
if (activeModelTypeKey === requestKey) {
priorityTagSuggestionsPromise = null;
}
});
}
return priorityTagSuggestionsPromise;
}
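The request-key comparison inside `ensurePriorityTagSuggestions` guards against a model-type switch while a fetch is in flight. The same pattern can be sketched generically (the `createSuggestionCache`/`loader` names are illustrative stand-ins, not project API):

```javascript
// Generic sketch of the caching pattern above: one in-flight promise per
// key, with a staleness check so a key change mid-request does not
// poison the cache. `loader` stands in for getPriorityTagSuggestions.
function createSuggestionCache(loader) {
  let activeKey = '';
  let cached = [];
  let loaded = false;
  let pending = null;

  return function ensure(key) {
    if (key !== activeKey) {
      // Key changed: discard cached results and any stale promise.
      activeKey = key;
      cached = [];
      loaded = false;
      pending = null;
    }
    if (loaded && !pending) return Promise.resolve(cached);
    if (!pending) {
      const requestKey = activeKey;
      pending = loader(requestKey)
        .then((tags) => {
          if (activeKey === requestKey) { cached = tags; loaded = true; }
          return tags;
        })
        .catch(() => {
          if (activeKey === requestKey) { cached = []; loaded = true; }
          return [];
        })
        .finally(() => {
          if (activeKey === requestKey) pending = null;
        });
    }
    return pending;
  };
}
```

Because the resolved key is captured as `requestKey`, a late-arriving response for an old key is simply dropped rather than overwriting the new key's cache.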
activeModelTypeKey = resolveModelTypeKey();
if (activeModelTypeKey) {
ensurePriorityTagSuggestions();
}
window.addEventListener('lm:priority-tags-updated', () => {
if (!activeModelTypeKey) {
return;
}
resetSuggestionState();
ensurePriorityTagSuggestions().then(() => {
document.querySelectorAll('.metadata-edit-container .metadata-suggestions-container').forEach((container) => {
renderPriorityTagSuggestions(container, getCurrentEditTags());
});
updateSuggestionsDropdown();
});
});
// Create a named function so we can remove it later
let saveTagsHandler = null;
@@ -12,10 +141,13 @@ let saveTagsHandler = null;
/**
* Set up tag editing mode
*/
export function setupTagEditMode(modelType = null) {
const editBtn = document.querySelector('.edit-tags-btn');
if (!editBtn) return;

setActiveModelTypeKey(modelType);
ensurePriorityTagSuggestions();

// Store original tags for restoring on cancel
let originalTags = [];
@@ -70,6 +202,7 @@ export function setupTagEditMode() {
// Setup delete buttons for existing tags
setupDeleteButtons();
setupTagDragAndDrop();

// Transfer click event from original button to the cloned one
const newEditBtn = editContainer.querySelector('.metadata-header-btn');
@@ -85,6 +218,7 @@ export function setupTagEditMode() {
// Just show the existing edit container
tagsEditContainer.style.display = 'block';
editBtn.style.display = 'none';
setupTagDragAndDrop();
}
} else {
// Exit edit mode
@@ -273,9 +407,31 @@ function createSuggestionsDropdown(existingTags = []) {
// Create tag container
const container = document.createElement('div');
container.className = 'metadata-suggestions-container';
if (priorityTagSuggestionsLoaded && !priorityTagSuggestionsPromise) {
renderPriorityTagSuggestions(container, existingTags);
} else {
container.innerHTML = `<div class="metadata-suggestions-loading">${translate('settings.priorityTags.loadingSuggestions', 'Loading suggestions…')}</div>`;
ensurePriorityTagSuggestions().then(() => {
if (!container.isConnected) {
return;
}
renderPriorityTagSuggestions(container, getCurrentEditTags());
updateSuggestionsDropdown();
}).catch(() => {
if (container.isConnected) {
container.innerHTML = '';
}
});
}
dropdown.appendChild(container);
return dropdown;
}
function renderPriorityTagSuggestions(container, existingTags = []) {
container.innerHTML = '';
priorityTagSuggestions.forEach((tag) => {
const isAdded = existingTags.includes(tag);
const item = document.createElement('div');
@@ -290,23 +446,16 @@ function createSuggestionsDropdown(existingTags = []) {
item.addEventListener('click', () => {
addNewTag(tag);

const input = document.querySelector('.metadata-input');
if (input) input.value = tag;
if (input) input.focus();

updateSuggestionsDropdown();
});
}

container.appendChild(item);
});
}
/**
@@ -342,6 +491,213 @@ function setupDeleteButtons() {
});
}
/**
* Enable drag-and-drop sorting for tag items
*/
function setupTagDragAndDrop() {
const container = document.querySelector(METADATA_ITEMS_CONTAINER_SELECTOR);
if (!container) {
return;
}
container.querySelectorAll(METADATA_ITEM_SELECTOR).forEach((item) => {
item.removeAttribute('draggable');
if (item.classList.contains(METADATA_ITEM_PLACEHOLDER_CLASS)) {
return;
}
if (item.dataset.pointerDragInit === 'true') {
return;
}
item.addEventListener('pointerdown', handleTagPointerDown);
item.dataset.pointerDragInit = 'true';
});
}
function handleTagPointerDown(event) {
if (event.button !== 0) {
return;
}
if (event.target.closest('.metadata-delete-btn')) {
return;
}
const item = event.currentTarget;
const container = item?.closest(METADATA_ITEMS_CONTAINER_SELECTOR);
if (!item || !container) {
return;
}
event.preventDefault();
startPointerDrag({ item, container, startEvent: event });
}
function startPointerDrag({ item, container, startEvent }) {
if (activeTagDragState) {
finishPointerDrag();
}
const itemRect = item.getBoundingClientRect();
const placeholder = document.createElement('div');
placeholder.className = `metadata-item ${METADATA_ITEM_PLACEHOLDER_CLASS}`;
placeholder.style.width = `${itemRect.width}px`;
placeholder.style.height = `${itemRect.height}px`;
container.insertBefore(placeholder, item);
item.classList.add(METADATA_ITEM_DRAGGING_CLASS);
item.style.width = `${itemRect.width}px`;
item.style.height = `${itemRect.height}px`;
item.style.position = 'fixed';
item.style.left = `${itemRect.left}px`;
item.style.top = `${itemRect.top}px`;
item.style.pointerEvents = 'none';
item.style.zIndex = '1000';
container.classList.add(METADATA_ITEMS_SORTING_CLASS);
if (document.body) {
document.body.classList.add(BODY_DRAGGING_CLASS);
}
const dragState = {
container,
item,
placeholder,
offsetX: startEvent.clientX - itemRect.left,
offsetY: startEvent.clientY - itemRect.top,
lastKnownPointer: { x: startEvent.clientX, y: startEvent.clientY },
rafId: null,
};
activeTagDragState = dragState;
document.addEventListener('pointermove', handlePointerMove);
document.addEventListener('pointerup', handlePointerUp);
document.addEventListener('pointercancel', handlePointerUp);
}
function handlePointerMove(event) {
if (!activeTagDragState) {
return;
}
activeTagDragState.lastKnownPointer = { x: event.clientX, y: event.clientY };
if (activeTagDragState.rafId !== null) {
return;
}
activeTagDragState.rafId = requestAnimationFrame(() => {
if (!activeTagDragState) {
return;
}
activeTagDragState.rafId = null;
updateDraggingItemPosition();
updatePlaceholderPosition();
});
}
function handlePointerUp() {
finishPointerDrag();
}
function updateDraggingItemPosition() {
if (!activeTagDragState) {
return;
}
const { item, offsetX, offsetY, lastKnownPointer } = activeTagDragState;
const left = lastKnownPointer.x - offsetX;
const top = lastKnownPointer.y - offsetY;
item.style.left = `${left}px`;
item.style.top = `${top}px`;
}
function updatePlaceholderPosition() {
if (!activeTagDragState) {
return;
}
const { container, placeholder, item, lastKnownPointer } = activeTagDragState;
const siblings = Array.from(
container.querySelectorAll(
`${METADATA_ITEM_SELECTOR}:not(.${METADATA_ITEM_PLACEHOLDER_CLASS})`
)
).filter((element) => element !== item);
let insertAfter = null;
for (const sibling of siblings) {
const rect = sibling.getBoundingClientRect();
if (lastKnownPointer.y < rect.top) {
container.insertBefore(placeholder, sibling);
return;
}
if (lastKnownPointer.y <= rect.bottom) {
if (lastKnownPointer.x < rect.left + rect.width / 2) {
container.insertBefore(placeholder, sibling);
return;
}
insertAfter = sibling;
continue;
}
insertAfter = sibling;
}
if (!insertAfter) {
container.insertBefore(placeholder, container.firstElementChild);
return;
}
container.insertBefore(placeholder, insertAfter.nextSibling);
}
function finishPointerDrag() {
if (!activeTagDragState) {
return;
}
const { container, item, placeholder, rafId } = activeTagDragState;
document.removeEventListener('pointermove', handlePointerMove);
document.removeEventListener('pointerup', handlePointerUp);
document.removeEventListener('pointercancel', handlePointerUp);
container.classList.remove(METADATA_ITEMS_SORTING_CLASS);
if (document.body) {
document.body.classList.remove(BODY_DRAGGING_CLASS);
}
if (rafId !== null) {
cancelAnimationFrame(rafId);
activeTagDragState.rafId = null;
updateDraggingItemPosition();
updatePlaceholderPosition();
}
if (placeholder && placeholder.parentNode === container) {
container.insertBefore(item, placeholder);
container.removeChild(placeholder);
}
item.classList.remove(METADATA_ITEM_DRAGGING_CLASS);
item.style.position = '';
item.style.width = '';
item.style.height = '';
item.style.left = '';
item.style.top = '';
item.style.pointerEvents = '';
item.style.zIndex = '';
activeTagDragState = null;
updateSuggestionsDropdown();
}
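The hit-testing in `updatePlaceholderPosition` can be expressed as a pure function over the siblings' bounding rects, which makes the row-wrapping behavior easy to reason about (the `placeholderIndex` name is illustrative; this is a sketch of the same logic, not the project's code):

```javascript
// Pure restatement of the updatePlaceholderPosition hit-test: given the
// sibling bounding rects (in document order) and the pointer position,
// return the index at which the placeholder should be inserted.
function placeholderIndex(rects, pointer) {
  let index = rects.length; // default: after the last item
  for (let i = 0; i < rects.length; i++) {
    const rect = rects[i];
    if (pointer.y < rect.top) return i;          // pointer is above this row
    if (pointer.y <= rect.bottom) {
      if (pointer.x < rect.left + rect.width / 2) return i; // left half: before
      index = i + 1;                             // right half: after, keep scanning
      continue;
    }
    index = i + 1;                               // pointer is below this row
  }
  return index;
}
```

Within a row, the item's horizontal midpoint decides before/after; a pointer above the first row inserts at the front, and below the last row inserts at the end.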
/**
* Add a new tag
* @param {string} tag - Tag to add
@@ -395,6 +751,7 @@ function addNewTag(tag) {
});

tagsContainer.appendChild(newTag);
setupTagDragAndDrop();

// Update status of items in the suggestions dropdown
updateSuggestionsDropdown();
@@ -408,8 +765,7 @@ function updateSuggestionsDropdown() {
if (!dropdown) return;

// Get all current tags
const existingTags = getCurrentEditTags();

// Update status of each item in dropdown
dropdown.querySelectorAll('.metadata-suggestion-item').forEach(item => {
@@ -456,6 +812,15 @@ function updateSuggestionsDropdown() {
});
}
function getCurrentEditTags() {
const currentTags = document.querySelectorAll(
`${METADATA_ITEM_SELECTOR}[data-tag]`
);
return Array.from(currentTags)
.map(tag => tag.dataset.tag)
.filter(Boolean);
}
/**
* Restore original tags when canceling edit
* @param {HTMLElement} section - The tags section


@@ -4,7 +4,8 @@ import { updateCardsForBulkMode } from '../components/shared/ModelCard.js';
import { modalManager } from './ModalManager.js';
import { getModelApiClient, resetAndReload } from '../api/modelApiFactory.js';
import { MODEL_TYPES, MODEL_CONFIG } from '../api/apiConfig.js';
import { BASE_MODEL_CATEGORIES } from '../utils/constants.js';
import { getPriorityTagSuggestions } from '../utils/priorityTagHelpers.js';
import { eventManager } from '../utils/EventManager.js';
import { translate } from '../utils/i18nHelpers.js';
@@ -59,6 +60,26 @@ export class BulkManager {
setContentRating: true
}
};
window.addEventListener('lm:priority-tags-updated', () => {
const container = document.querySelector('#bulkAddTagsModal .metadata-suggestions-container');
if (!container) {
return;
}
const currentType = state.currentPageType;
if (!currentType || currentType === 'recipes') {
return;
}
getPriorityTagSuggestions(currentType).then((tags) => {
if (!container.isConnected) {
return;
}
this.renderBulkSuggestionItems(container, tags);
this.updateBulkSuggestionsDropdown();
}).catch(() => {
// Ignore refresh failures; UI will retry on next open
});
});
}

initialize() {
@@ -565,7 +586,7 @@ export class BulkManager {
// Create suggestions dropdown
const tagForm = document.querySelector('#bulkAddTagsModal .metadata-add-form');
if (tagForm) {
const suggestionsDropdown = this.createBulkSuggestionsDropdown();
tagForm.appendChild(suggestionsDropdown);
}
} }
@@ -586,7 +607,7 @@ export class BulkManager {
}
}

createBulkSuggestionsDropdown() {
const dropdown = document.createElement('div');
dropdown.className = 'metadata-suggestions-dropdown';
@@ -600,9 +621,33 @@ export class BulkManager {
const container = document.createElement('div');
container.className = 'metadata-suggestions-container';
container.innerHTML = `<div class="metadata-suggestions-loading">${translate('settings.priorityTags.loadingSuggestions', 'Loading suggestions…')}</div>`;

const currentType = state.currentPageType;
if (!currentType || currentType === 'recipes') {
container.innerHTML = '';
} else {
getPriorityTagSuggestions(currentType).then((tags) => {
if (!container.isConnected) {
return;
}
this.renderBulkSuggestionItems(container, tags);
this.updateBulkSuggestionsDropdown();
}).catch(() => {
if (container.isConnected) {
container.innerHTML = '';
}
});
}
dropdown.appendChild(container);
return dropdown;
}
renderBulkSuggestionItems(container, tags) {
container.innerHTML = '';
tags.forEach(tag => {
const existingTags = this.getBulkExistingTags();
const isAdded = existingTags.includes(tag);
@@ -622,16 +667,12 @@ export class BulkManager {
input.value = tag;
input.focus();
}

this.updateBulkSuggestionsDropdown();
});
}

container.appendChild(item);
});
}
addBulkTag(tag) {
@@ -1105,10 +1146,7 @@ export class BulkManager {
// Call the auto-organize method with selected file paths
await apiClient.autoOrganizeModels(filePaths);

resetAndReload(true);
} catch (error) {
console.error('Error during bulk auto-organize:', error);
showToast('toast.loras.autoOrganizeFailed', { error: error.message }, 'error');


@@ -14,6 +14,7 @@ export class DownloadManager {
this.modelInfo = null;
this.modelVersionId = null;
this.modelId = null;
this.source = null;
this.initialized = false;
this.selectedFolder = '';
@@ -126,6 +127,7 @@ export class DownloadManager {
this.modelInfo = null;
this.modelId = null;
this.modelVersionId = null;
this.source = null;
this.selectedFolder = '';
@@ -150,7 +152,7 @@ export class DownloadManager {
throw new Error(translate('modals.download.errors.invalidUrl'));
}

this.versions = await this.apiClient.fetchCivitaiVersions(this.modelId, this.source);

if (!this.versions.length) {
throw new Error(translate('modals.download.errors.noVersions'));
@@ -170,13 +172,22 @@ export class DownloadManager {
}

extractModelId(url) {
const versionMatch = url.match(/modelVersionId=(\d+)/i);
this.modelVersionId = versionMatch ? versionMatch[1] : null;

const civarchiveMatch = url.match(/https?:\/\/(?:www\.)?(?:civitaiarchive|civarchive)\.com\/models\/(\d+)/i);
if (civarchiveMatch) {
this.source = 'civarchive';
return civarchiveMatch[1];
}

const civitaiMatch = url.match(/https?:\/\/(?:www\.)?civitai\.com\/models\/(\d+)/i);
if (civitaiMatch) {
this.source = null;
return civitaiMatch[1];
}

this.source = null;
return null;
}
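The reworked `extractModelId` above distinguishes CivArchive mirrors from civitai.com URLs and records the source for later API calls. The same parsing can be sketched as a standalone function (the `parseModelUrl` name and return shape are illustrative, not the class method itself):

```javascript
// Standalone restatement of the extractModelId parsing above: returns
// the model id, the source ('civarchive', or null for civitai.com),
// and any modelVersionId query parameter.
function parseModelUrl(url) {
  const versionMatch = url.match(/modelVersionId=(\d+)/i);
  const versionId = versionMatch ? versionMatch[1] : null;

  const civarchiveMatch = url.match(/https?:\/\/(?:www\.)?(?:civitaiarchive|civarchive)\.com\/models\/(\d+)/i);
  if (civarchiveMatch) {
    return { modelId: civarchiveMatch[1], source: 'civarchive', versionId };
  }

  const civitaiMatch = url.match(/https?:\/\/(?:www\.)?civitai\.com\/models\/(\d+)/i);
  if (civitaiMatch) {
    return { modelId: civitaiMatch[1], source: null, versionId };
  }

  return { modelId: null, source: null, versionId };
}
```

Note the CivArchive alternation is checked first; the civitai.com pattern would not match `civitaiarchive.com` anyway because the regex requires the host to read exactly `civitai.com` after the optional `www.`.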
@@ -453,7 +464,13 @@ export class DownloadManager {
}

if (data.status === 'progress' && data.download_id === downloadId) {
const metrics = {
bytesDownloaded: data.bytes_downloaded,
totalBytes: data.total_bytes,
bytesPerSecond: data.bytes_per_second
};
updateProgress(data.progress, 0, this.currentVersion.name, metrics);

if (data.progress < 3) {
this.loadingManager.setStatus(translate('modals.download.status.preparing'));
@@ -478,7 +495,8 @@ export class DownloadManager {
modelRoot,
targetFolder,
useDefaultPaths,
downloadId,
this.source
);

showToast('toast.loras.downloadCompleted', {}, 'success');


@@ -13,8 +13,10 @@ export class ExampleImagesManager {
this.progressPanel = null;
this.isProgressPanelCollapsed = false;
this.pauseButton = null; // Store reference to the pause button
this.stopButton = null;
this.isMigrating = false; // Track migration state separately from downloading
this.hasShownCompletionToast = false; // Flag to track if completion toast has been shown
this.isStopping = false;
// Auto download properties
this.autoDownloadInterval = null;
@@ -52,12 +54,17 @@ export class ExampleImagesManager {
// Initialize progress panel button handlers
this.pauseButton = document.getElementById('pauseExampleDownloadBtn');
this.stopButton = document.getElementById('stopExampleDownloadBtn');
const collapseBtn = document.getElementById('collapseProgressBtn');

if (this.pauseButton) {
this.pauseButton.onclick = () => this.pauseDownload();
}

if (this.stopButton) {
this.stopButton.onclick = () => this.stopDownload();
}

if (collapseBtn) {
collapseBtn.onclick = () => this.toggleProgressPanel();
}
@@ -210,10 +217,14 @@ export class ExampleImagesManager {
updateDownloadButtonText() {
const btnTextElement = document.getElementById('exampleDownloadBtnText');
if (btnTextElement) {
if (this.isStopping) {
btnTextElement.textContent = "Stopping...";
} else if (this.isDownloading && this.isPaused) {
btnTextElement.textContent = "Resume";
} else if (!this.isDownloading) {
btnTextElement.textContent = "Download";
} else {
btnTextElement.textContent = "Download";
}
}
}
@@ -243,12 +254,16 @@ export class ExampleImagesManager {
if (data.success) {
this.isDownloading = true;
this.isPaused = false;
this.isStopping = false;
this.hasShownCompletionToast = false; // Reset toast flag when starting new download
this.startTime = new Date();

this.updateUI(data.status);
this.showProgressPanel();
this.startProgressUpdates();
this.updateDownloadButtonText();

if (this.stopButton) {
this.stopButton.disabled = false;
}

showToast('toast.exampleImages.downloadStarted', {}, 'success');
// Close settings modal
@@ -263,7 +278,7 @@ export class ExampleImagesManager {
}

async pauseDownload() {
if (!this.isDownloading || this.isPaused || this.isStopping) {
return;
}
@@ -299,7 +314,7 @@ export class ExampleImagesManager {
}

async resumeDownload() {
if (!this.isDownloading || !this.isPaused || this.isStopping) {
return;
}
@@ -334,6 +349,60 @@ export class ExampleImagesManager {
}
}
async stopDownload() {
if (this.isStopping) {
return;
}
if (!this.isDownloading) {
this.hideProgressPanel();
return;
}
this.isStopping = true;
this.isPaused = false;
this.updateDownloadButtonText();
if (this.stopButton) {
this.stopButton.disabled = true;
}
try {
const response = await fetch('/api/lm/stop-example-images', {
method: 'POST'
});
let data;
try {
data = await response.json();
} catch (parseError) {
data = { success: false, error: 'Invalid server response' };
}
if (response.ok && data.success) {
showToast('toast.exampleImages.downloadStopped', {}, 'info');
this.hideProgressPanel();
} else {
this.isStopping = false;
if (this.stopButton) {
this.stopButton.disabled = false;
}
const errorMessage = data && data.error ? data.error : 'Unknown error';
showToast('toast.exampleImages.stopFailed', { error: errorMessage }, 'error');
}
} catch (error) {
console.error('Failed to stop download:', error);
this.isStopping = false;
if (this.stopButton) {
this.stopButton.disabled = false;
}
const errorMessage = error && error.message ? error.message : 'Unknown error';
showToast('toast.exampleImages.stopFailed', { error: errorMessage }, 'error');
} finally {
this.updateDownloadButtonText();
}
}
startProgressUpdates() {
// Clear any existing interval
if (this.progressUpdateInterval) {
@@ -352,10 +421,22 @@ export class ExampleImagesManager {
const data = await response.json();

if (data.success) {
const currentStatus = data.status.status;
this.isDownloading = data.is_downloading;
this.isPaused = currentStatus === 'paused';
this.isMigrating = data.is_migrating || false;
if (currentStatus === 'stopping') {
this.isStopping = true;
} else if (
!data.is_downloading ||
currentStatus === 'stopped' ||
currentStatus === 'completed' ||
currentStatus === 'error'
) {
this.isStopping = false;
}
// Update download button text // Update download button text
this.updateDownloadButtonText(); this.updateDownloadButtonText();
@@ -365,8 +446,11 @@ export class ExampleImagesManager {
// Download completed or failed // Download completed or failed
clearInterval(this.progressUpdateInterval); clearInterval(this.progressUpdateInterval);
this.progressUpdateInterval = null; this.progressUpdateInterval = null;
if (this.stopButton) {
this.stopButton.disabled = true;
}
if (data.status.status === 'completed' && !this.hasShownCompletionToast) { if (currentStatus === 'completed' && !this.hasShownCompletionToast) {
const actionType = this.isMigrating ? 'migration' : 'download'; const actionType = this.isMigrating ? 'migration' : 'download';
showToast('toast.downloads.imagesCompleted', { action: actionType }, 'success'); showToast('toast.downloads.imagesCompleted', { action: actionType }, 'success');
// Mark as shown to prevent duplicate toasts // Mark as shown to prevent duplicate toasts
@@ -375,10 +459,13 @@ export class ExampleImagesManager {
this.isMigrating = false; this.isMigrating = false;
// Hide the panel after a delay // Hide the panel after a delay
setTimeout(() => this.hideProgressPanel(), 5000); setTimeout(() => this.hideProgressPanel(), 5000);
} else if (data.status.status === 'error') { } else if (currentStatus === 'error') {
const actionType = this.isMigrating ? 'migration' : 'download'; const actionType = this.isMigrating ? 'migration' : 'download';
showToast('toast.downloads.imagesFailed', { action: actionType }, 'error'); showToast('toast.downloads.imagesFailed', { action: actionType }, 'error');
this.isMigrating = false; this.isMigrating = false;
} else if (currentStatus === 'stopped') {
this.hideProgressPanel();
this.isMigrating = false;
} }
} }
} }
@@ -435,6 +522,10 @@ export class ExampleImagesManager {
this.pauseButton = document.getElementById('pauseExampleDownloadBtn');
}
if (!this.stopButton) {
this.stopButton = document.getElementById('stopExampleDownloadBtn');
}
if (this.pauseButton) {
// Check if the button already has the SVG elements
let hasProgressElements = !!this.pauseButton.querySelector('.mini-progress-circle');
@@ -462,6 +553,8 @@ export class ExampleImagesManager {
? () => this.resumeDownload()
: () => this.pauseDownload();
this.pauseButton.disabled = ['completed', 'error', 'stopped'].includes(status.status) || status.status === 'stopping';
// Update progress immediately
const progressBar = document.getElementById('downloadProgressBar');
if (progressBar) {
@@ -470,6 +563,15 @@ export class ExampleImagesManager {
}
}
if (this.stopButton) {
if (status.status === 'stopping' || this.isStopping) {
this.stopButton.disabled = true;
} else {
const canStop = ['running', 'paused'].includes(status.status);
this.stopButton.disabled = !canStop;
}
}
// Update title text
const titleElement = document.querySelector('.progress-panel-title');
if (titleElement) {
@@ -584,6 +686,8 @@ export class ExampleImagesManager {
case 'paused': return 'Paused';
case 'completed': return 'Completed';
case 'error': return 'Error';
case 'stopping': return 'Stopping';
case 'stopped': return 'Stopped';
default: return 'Initializing';
}
}
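The handlers above juggle six status values (running, paused, stopping, stopped, completed, error). As a hedged summary — the state names come from the diff, but the transition table and helper below are an inferred sketch, not code from the repository:

```javascript
// Hypothetical sketch of the status lifecycle implied by the diff above.
// 'stopping' is entered optimistically on the client and confirmed (or
// cleared) by the next progress poll; terminal states disable all buttons.
const TRANSITIONS = {
  running: ['paused', 'stopping', 'completed', 'error'],
  paused: ['running', 'stopping'],
  stopping: ['stopped'],
  stopped: [],
  completed: [],
  error: [],
};

function canStop(status) {
  // Mirrors the stopButton logic: only an active download can be stopped.
  return ['running', 'paused'].includes(status);
}

console.log(canStop('paused'));   // true
console.log(canStop('stopped'));  // false
```

The separate `isStopping` flag bridges the gap between the user clicking stop and the server acknowledging it, which is why the poll handler clears it on every terminal status.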


@@ -6,7 +6,7 @@ import { getStorageItem, setStorageItem } from '../utils/storageHelpers.js';
export class HelpManager {
constructor() {
this.lastViewedTimestamp = getStorageItem('help_last_viewed', 0);
-this.latestContentTimestamp = new Date('2025-07-09').getTime(); // Will be updated from server or config
+this.latestContentTimestamp = new Date('2025-10-11').getTime(); // Will be updated from server or config
this.isInitialized = false;
// Default latest content data - could be fetched from server


@@ -1,3 +1,6 @@
import { translate } from '../utils/i18nHelpers.js';
import { formatFileSize } from '../utils/formatters.js';
// Loading management
export class LoadingManager {
constructor() {
@@ -65,7 +68,7 @@ export class LoadingManager {
// Show enhanced progress for downloads
showDownloadProgress(totalItems = 1) {
-this.show('Preparing download...', 0);
+this.show(translate('modals.download.status.preparing', {}, 'Preparing download...'), 0);
// Create details container
const detailsContainer = this.createDetailsContainer();
@@ -76,7 +79,7 @@ export class LoadingManager {
const currentItemLabel = document.createElement('div');
currentItemLabel.className = 'current-item-label';
-currentItemLabel.textContent = 'Current file:';
+currentItemLabel.textContent = translate('modals.download.progress.currentFile', {}, 'Current file:');
const currentItemBar = document.createElement('div');
currentItemBar.className = 'current-item-bar-container';
@@ -106,15 +109,95 @@ export class LoadingManager {
// Add current item progress to container
detailsContainer.appendChild(currentItemContainer);
// Create transfer stats container
const transferStats = document.createElement('div');
transferStats.className = 'download-transfer-stats';
const bytesDetail = document.createElement('div');
bytesDetail.className = 'download-transfer-bytes';
bytesDetail.textContent = translate(
'modals.download.progress.transferredUnknown',
{},
'Transferred: --'
);
const speedDetail = document.createElement('div');
speedDetail.className = 'download-transfer-speed';
speedDetail.textContent = translate(
'modals.download.progress.speed',
{ speed: '--' },
'Speed: --'
);
transferStats.appendChild(bytesDetail);
transferStats.appendChild(speedDetail);
detailsContainer.appendChild(transferStats);
const formatMetricSize = (value) => {
if (value === undefined || value === null || isNaN(value)) {
return '--';
}
if (value < 1) {
return '0 B';
}
return formatFileSize(value);
};
const updateTransferStats = (metrics = {}) => {
const { bytesDownloaded, totalBytes, bytesPerSecond } = metrics;
if (bytesDetail) {
const formattedDownloaded = formatMetricSize(bytesDownloaded);
const formattedTotal = formatMetricSize(totalBytes);
if (formattedDownloaded === '--' && formattedTotal === '--') {
bytesDetail.textContent = translate(
'modals.download.progress.transferredUnknown',
{},
'Transferred: --'
);
} else if (formattedTotal === '--') {
bytesDetail.textContent = translate(
'modals.download.progress.transferredSimple',
{ downloaded: formattedDownloaded },
`Transferred: ${formattedDownloaded}`
);
} else {
bytesDetail.textContent = translate(
'modals.download.progress.transferred',
{ downloaded: formattedDownloaded, total: formattedTotal },
`Transferred: ${formattedDownloaded} / ${formattedTotal}`
);
}
}
if (speedDetail) {
const formattedSpeed = formatMetricSize(bytesPerSecond);
const displaySpeed = formattedSpeed === '--' ? '--' : `${formattedSpeed}/s`;
speedDetail.textContent = translate(
'modals.download.progress.speed',
{ speed: displaySpeed },
`Speed: ${displaySpeed}`
);
}
};
// Initialize transfer stats with empty data
updateTransferStats();
// Return update function
-return (currentProgress, currentIndex = 0, currentName = '') => {
+return (currentProgress, currentIndex = 0, currentName = '', metrics = {}) => {
// Update current item progress
currentItemProgress.style.width = `${currentProgress}%`;
currentItemPercent.textContent = `${Math.floor(currentProgress)}%`;
// Update current item label if name provided
if (currentName) {
-currentItemLabel.textContent = `Downloading: ${currentName}`;
+currentItemLabel.textContent = translate(
+'modals.download.progress.downloading',
+{ name: currentName },
+`Downloading: ${currentName}`
+);
}
// Update overall label if multiple items
@@ -128,6 +211,8 @@ export class LoadingManager {
// Single item, just update main progress
this.setProgress(currentProgress);
}
updateTransferStats(metrics);
};
}
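The transfer-stats code above guards its metrics through `formatMetricSize` before display. A runnable sketch of that guard — note that `formatFileSize` is imported from `../utils/formatters.js` and its implementation is not shown in this diff, so the stand-in below is an assumed typical byte formatter, not the project's actual one:

```javascript
// Hypothetical stand-in for formatFileSize from ../utils/formatters.js
// (not shown in the diff); assumed to scale bytes into B/KB/MB/etc.
function formatFileSize(bytes) {
  const units = ['B', 'KB', 'MB', 'GB', 'TB'];
  let value = bytes;
  let i = 0;
  while (value >= 1024 && i < units.length - 1) {
    value /= 1024;
    i += 1;
  }
  return `${value.toFixed(value < 10 && i > 0 ? 1 : 0)} ${units[i]}`;
}

// Mirrors formatMetricSize's guard clauses from the diff above:
// missing/NaN metrics render as '--', sub-byte values clamp to '0 B'.
function formatMetricSize(value) {
  if (value === undefined || value === null || isNaN(value)) return '--';
  if (value < 1) return '0 B';
  return formatFileSize(value);
}

console.log(formatMetricSize(undefined)); // '--'
console.log(formatMetricSize(0));         // '0 B'
console.log(formatMetricSize(1536));      // '1.5 KB'
```

The '--' sentinel lets `updateTransferStats` distinguish "no data yet" from a genuine zero, which is why the polling loop can call it with an empty metrics object before the first progress event arrives.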


@@ -2,10 +2,11 @@ import { modalManager } from './ModalManager.js';
import { showToast } from '../utils/uiHelpers.js';
import { state, createDefaultSettings } from '../state/index.js';
import { resetAndReload } from '../api/modelApiFactory.js';
-import { DOWNLOAD_PATH_TEMPLATES, MAPPABLE_BASE_MODELS, PATH_TEMPLATE_PLACEHOLDERS, DEFAULT_PATH_TEMPLATES } from '../utils/constants.js';
+import { DOWNLOAD_PATH_TEMPLATES, MAPPABLE_BASE_MODELS, PATH_TEMPLATE_PLACEHOLDERS, DEFAULT_PATH_TEMPLATES, DEFAULT_PRIORITY_TAG_CONFIG } from '../utils/constants.js';
import { translate } from '../utils/i18nHelpers.js';
import { i18n } from '../i18n/index.js';
import { configureModelCardVideo } from '../components/shared/ModelCard.js';
import { validatePriorityTagString, getPriorityTagSuggestionsMap, invalidatePriorityTagSuggestionsCache } from '../utils/priorityTagHelpers.js';
export class SettingsManager {
constructor() {
@@ -111,6 +112,17 @@ export class SettingsManager {
merged.download_path_templates = { ...DEFAULT_PATH_TEMPLATES, ...templates };
const priorityTags = backendSettings?.priority_tags;
const normalizedPriority = { ...DEFAULT_PRIORITY_TAG_CONFIG };
if (priorityTags && typeof priorityTags === 'object' && !Array.isArray(priorityTags)) {
Object.entries(priorityTags).forEach(([modelType, configValue]) => {
if (typeof configValue === 'string') {
normalizedPriority[modelType] = configValue.trim();
}
});
}
merged.priority_tags = normalizedPriority;
Object.keys(merged).forEach(key => this.backendSettingKeys.add(key));
return merged;
@@ -185,7 +197,7 @@ export class SettingsManager {
button.addEventListener('click', () => this.toggleInputVisibility(button));
});
-const openSettingsLocationButton = document.querySelector('.settings-open-location-button');
+const openSettingsLocationButton = document.querySelector('.settings-open-location-trigger');
if (openSettingsLocationButton) {
openSettingsLocationButton.addEventListener('click', () => {
const filePath = openSettingsLocationButton.dataset.settingsPath;
@@ -217,6 +229,8 @@ export class SettingsManager {
}
});
this.setupPriorityTagInputs();
this.initialized = true;
}
@@ -276,6 +290,12 @@ export class SettingsManager {
cardInfoDisplaySelect.value = state.global.settings.card_info_display || 'always';
}
// Set model name display setting
const modelNameDisplaySelect = document.getElementById('modelNameDisplay');
if (modelNameDisplaySelect) {
modelNameDisplaySelect.value = state.global.settings.model_name_display || 'model_name';
}
// Set optimize example images setting
const optimizeExampleImagesCheckbox = document.getElementById('optimizeExampleImages');
if (optimizeExampleImagesCheckbox) {
@@ -291,6 +311,9 @@ export class SettingsManager {
// Load download path templates
this.loadDownloadPathTemplates();
// Load priority tag settings
this.loadPriorityTagSettings();
// Set include trigger words setting
const includeTriggerWordsCheckbox = document.getElementById('includeTriggerWords');
if (includeTriggerWordsCheckbox) {
@@ -325,6 +348,145 @@ export class SettingsManager {
this.loadProxySettings();
}
setupPriorityTagInputs() {
['lora', 'checkpoint', 'embedding'].forEach((modelType) => {
const textarea = document.getElementById(`${modelType}PriorityTagsInput`);
if (!textarea) {
return;
}
textarea.addEventListener('input', () => this.handlePriorityTagInput(modelType));
textarea.addEventListener('blur', () => this.handlePriorityTagSave(modelType));
textarea.addEventListener('keydown', (event) => this.handlePriorityTagKeyDown(event, modelType));
});
}
loadPriorityTagSettings() {
const priorityConfig = state.global.settings.priority_tags || {};
['lora', 'checkpoint', 'embedding'].forEach((modelType) => {
const textarea = document.getElementById(`${modelType}PriorityTagsInput`);
if (!textarea) {
return;
}
const storedValue = priorityConfig[modelType] ?? DEFAULT_PRIORITY_TAG_CONFIG[modelType] ?? '';
textarea.value = storedValue;
this.displayPriorityTagValidation(modelType, true, []);
});
}
handlePriorityTagInput(modelType) {
const textarea = document.getElementById(`${modelType}PriorityTagsInput`);
if (!textarea) {
return;
}
const validation = validatePriorityTagString(textarea.value);
this.displayPriorityTagValidation(modelType, validation.valid, validation.errors);
}
handlePriorityTagKeyDown(event, modelType) {
if (event.key !== 'Enter') {
return;
}
if (event.shiftKey) {
return;
}
event.preventDefault();
this.handlePriorityTagSave(modelType);
}
async handlePriorityTagSave(modelType) {
const textarea = document.getElementById(`${modelType}PriorityTagsInput`);
if (!textarea) {
return;
}
const validation = validatePriorityTagString(textarea.value);
if (!validation.valid) {
this.displayPriorityTagValidation(modelType, false, validation.errors);
return;
}
const sanitized = validation.formatted;
const currentValue = state.global.settings.priority_tags?.[modelType] || '';
this.displayPriorityTagValidation(modelType, true, []);
if (sanitized === currentValue) {
textarea.value = sanitized;
return;
}
const updatedConfig = {
...state.global.settings.priority_tags,
[modelType]: sanitized,
};
try {
textarea.value = sanitized;
await this.saveSetting('priority_tags', updatedConfig);
showToast('settings.priorityTags.saveSuccess', {}, 'success');
await this.refreshPriorityTagSuggestions();
} catch (error) {
console.error('Failed to save priority tag configuration:', error);
showToast('settings.priorityTags.saveError', {}, 'error');
}
}
displayPriorityTagValidation(modelType, isValid, errors = []) {
const textarea = document.getElementById(`${modelType}PriorityTagsInput`);
const errorElement = document.getElementById(`${modelType}PriorityTagsError`);
if (!textarea) {
return;
}
if (isValid || errors.length === 0) {
textarea.classList.remove('settings-input-error');
if (errorElement) {
errorElement.textContent = '';
errorElement.style.display = 'none';
}
return;
}
textarea.classList.add('settings-input-error');
if (errorElement) {
const message = this.getPriorityTagErrorMessage(errors[0]);
errorElement.textContent = message;
errorElement.style.display = 'block';
}
}
getPriorityTagErrorMessage(error) {
if (!error) {
return '';
}
const entryIndex = error.index ?? 0;
switch (error.type) {
case 'missingClosingParen':
return translate('settings.priorityTags.validation.missingClosingParen', { index: entryIndex }, `Entry ${entryIndex} is missing a closing parenthesis.`);
case 'missingCanonical':
return translate('settings.priorityTags.validation.missingCanonical', { index: entryIndex }, `Entry ${entryIndex} must include a canonical tag.`);
case 'duplicateCanonical':
return translate('settings.priorityTags.validation.duplicateCanonical', { tag: error.canonical }, `The canonical tag "${error.canonical}" is duplicated.`);
default:
return translate('settings.priorityTags.validation.unknown', {}, 'Invalid priority tag configuration.');
}
}
async refreshPriorityTagSuggestions() {
invalidatePriorityTagSuggestionsCache();
try {
await getPriorityTagSuggestionsMap();
window.dispatchEvent(new CustomEvent('lm:priority-tags-updated'));
} catch (error) {
console.warn('Failed to refresh priority tag suggestions:', error);
}
}
loadProxySettings() {
// Load proxy enabled setting
const proxyEnabledCheckbox = document.getElementById('proxyEnabled');
@@ -1055,6 +1217,10 @@ export class SettingsManager {
}
showToast('toast.settings.settingsUpdated', { setting: settingKey.replace(/_/g, ' ') }, 'success');
if (settingKey === 'model_name_display') {
this.reloadContent();
}
} catch (error) {
showToast('toast.settings.settingSaveFailed', { message: error.message }, 'error');
}


@@ -164,7 +164,13 @@ export class DownloadManager {
const loraName = currentLora ? currentLora.name : '';
// Update progress display
-updateProgress(currentLoraProgress, completedDownloads, loraName);
+const metrics = {
+bytesDownloaded: data.bytes_downloaded,
+totalBytes: data.total_bytes,
+bytesPerSecond: data.bytes_per_second
+};
+updateProgress(currentLoraProgress, completedDownloads, loraName, metrics);
// Add more detailed status messages based on progress
if (currentLoraProgress < 3) {


@@ -1,7 +1,7 @@
// Create the new hierarchical state structure
import { getStorageItem, getMapFromStorage } from '../utils/storageHelpers.js';
import { MODEL_TYPES } from '../api/apiConfig.js';
-import { DEFAULT_PATH_TEMPLATES } from '../utils/constants.js';
+import { DEFAULT_PATH_TEMPLATES, DEFAULT_PRIORITY_TAG_CONFIG } from '../utils/constants.js';
const DEFAULT_SETTINGS_BASE = Object.freeze({
civitai_api_key: '',
@@ -26,8 +26,10 @@ const DEFAULT_SETTINGS_BASE = Object.freeze({
autoplay_on_hover: false,
display_density: 'default',
card_info_display: 'always',
model_name_display: 'model_name',
include_trigger_words: false,
compact_mode: false,
priority_tags: { ...DEFAULT_PRIORITY_TAG_CONFIG },
});
export function createDefaultSettings() {
@@ -35,6 +37,7 @@ export function createDefaultSettings() {
...DEFAULT_SETTINGS_BASE,
base_model_path_mappings: {},
download_path_templates: { ...DEFAULT_PATH_TEMPLATES },
priority_tags: { ...DEFAULT_PRIORITY_TAG_CONFIG },
};
}
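The spreads in the state defaults matter: `DEFAULT_SETTINGS_BASE` is frozen, so the nested `priority_tags` object must be copied per call or every settings object would share one mutable reference. A minimal illustration of that design choice (constant names shortened for the sketch):

```javascript
// Frozen shared defaults, as in DEFAULT_PRIORITY_TAG_CONFIG.
const DEFAULTS = Object.freeze({ lora: 'character, style' });

function createSettings() {
  // Shallow copy per call: mutating one settings object cannot
  // leak into another, and the frozen source stays pristine.
  return { priority_tags: { ...DEFAULTS } };
}

const a = createSettings();
const b = createSettings();
a.priority_tags.lora = 'changed';
console.log(b.priority_tags.lora); // 'character, style'
```

Without the spread, `a.priority_tags` and `b.priority_tags` would be the same object, and the first user edit would silently rewrite the defaults for every later call.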


@@ -194,10 +194,16 @@ export const BASE_MODEL_CATEGORIES = {
]
};
-// Preset tag suggestions
+// Default priority tag entries for fallback suggestions and initial settings
-export const PRESET_TAGS = [
+export const DEFAULT_PRIORITY_TAG_ENTRIES = [
'character', 'concept', 'clothing',
'realistic', 'anime', 'toon', 'furry', 'style',
-'poses', 'background', 'vehicle', 'buildings',
+'poses', 'background', 'tool', 'vehicle', 'buildings',
-'objects', 'animal'
+'objects', 'assets', 'animal', 'action'
];
export const DEFAULT_PRIORITY_TAG_CONFIG = {
lora: DEFAULT_PRIORITY_TAG_ENTRIES.join(', '),
checkpoint: DEFAULT_PRIORITY_TAG_ENTRIES.join(', '),
embedding: DEFAULT_PRIORITY_TAG_ENTRIES.join(', ')
};


@@ -1,8 +1,6 @@
import { modalManager } from '../managers/ModalManager.js';
import { getModelApiClient } from '../api/modelApiFactory.js';
-const apiClient = getModelApiClient();
let pendingDeletePath = null;
let pendingExcludePath = null;
@@ -27,7 +25,7 @@ export async function confirmDelete() {
if (!pendingDeletePath) return;
try {
-await apiClient.deleteModel(pendingDeletePath);
+await getModelApiClient().deleteModel(pendingDeletePath);
closeDeleteModal();
@@ -72,7 +70,7 @@ export async function confirmExclude() {
if (!pendingExcludePath) return;
try {
-await apiClient.excludeModel(pendingExcludePath);
+await getModelApiClient().excludeModel(pendingExcludePath);
closeExcludeModal();


@@ -0,0 +1,285 @@
import { DEFAULT_PRIORITY_TAG_CONFIG } from './constants.js';
const MODEL_TYPE_ALIAS_MAP = {
loras: 'lora',
lora: 'lora',
checkpoints: 'checkpoint',
checkpoint: 'checkpoint',
embeddings: 'embedding',
embedding: 'embedding',
};
function normalizeModelTypeKey(modelType) {
if (typeof modelType !== 'string') {
return '';
}
const lower = modelType.toLowerCase();
if (MODEL_TYPE_ALIAS_MAP[lower]) {
return MODEL_TYPE_ALIAS_MAP[lower];
}
if (lower.endsWith('s')) {
return lower.slice(0, -1);
}
return lower;
}
function splitPriorityEntries(raw = '') {
const segments = [];
raw.split('\n').forEach(line => {
line.split(',').forEach(part => {
const trimmed = part.trim();
if (trimmed) {
segments.push(trimmed);
}
});
});
return segments;
}
export function parsePriorityTagString(raw = '') {
const entries = [];
const rawEntries = splitPriorityEntries(raw);
rawEntries.forEach((entry) => {
const { canonical, aliases } = parsePriorityEntry(entry);
if (!canonical) {
return;
}
entries.push({ canonical, aliases });
});
return entries;
}
function parsePriorityEntry(entry) {
let canonical = entry;
let aliasSection = '';
const openIndex = entry.indexOf('(');
if (openIndex !== -1) {
if (!entry.endsWith(')')) {
canonical = entry.replace('(', '').replace(')', '');
} else {
canonical = entry.slice(0, openIndex).trim();
aliasSection = entry.slice(openIndex + 1, -1);
}
}
canonical = canonical.trim();
if (!canonical) {
return { canonical: '', aliases: [] };
}
const aliasList = aliasSection ? aliasSection.split('|').map((alias) => alias.trim()).filter(Boolean) : [];
const seen = new Set();
const normalizedCanonical = canonical.toLowerCase();
const uniqueAliases = [];
aliasList.forEach((alias) => {
const normalized = alias.toLowerCase();
if (normalized === normalizedCanonical) {
return;
}
if (!seen.has(normalized)) {
seen.add(normalized);
uniqueAliases.push(alias);
}
});
return { canonical, aliases: uniqueAliases };
}
export function formatPriorityTagEntries(entries, useNewlines = false) {
if (!entries.length) {
return '';
}
const separator = useNewlines ? ',\n' : ', ';
return entries.map(({ canonical, aliases }) => {
if (aliases && aliases.length) {
return `${canonical}(${aliases.join('|')})`;
}
return canonical;
}).join(separator);
}
export function validatePriorityTagString(raw = '') {
const trimmed = raw.trim();
if (!trimmed) {
return { valid: true, errors: [], entries: [], formatted: '' };
}
const errors = [];
const entries = [];
const rawEntries = splitPriorityEntries(raw);
const seenCanonicals = new Set();
rawEntries.forEach((entry, index) => {
const hasOpening = entry.includes('(');
const hasClosing = entry.endsWith(')');
if (hasOpening && !hasClosing) {
errors.push({ type: 'missingClosingParen', index: index + 1 });
}
const { canonical, aliases } = parsePriorityEntry(entry);
if (!canonical) {
errors.push({ type: 'missingCanonical', index: index + 1 });
return;
}
const normalizedCanonical = canonical.toLowerCase();
if (seenCanonicals.has(normalizedCanonical)) {
errors.push({ type: 'duplicateCanonical', canonical });
} else {
seenCanonicals.add(normalizedCanonical);
}
entries.push({ canonical, aliases });
});
const formatted = errors.length === 0
? formatPriorityTagEntries(entries, raw.includes('\n'))
: raw.trim();
return {
valid: errors.length === 0,
errors,
entries,
formatted,
};
}
let cachedPriorityTagMap = null;
let fetchPromise = null;
export async function getPriorityTagSuggestionsMap() {
if (cachedPriorityTagMap) {
return cachedPriorityTagMap;
}
if (!fetchPromise) {
fetchPromise = fetch('/api/lm/priority-tags')
.then(async (response) => {
if (!response.ok) {
throw new Error(`HTTP ${response.status}`);
}
const data = await response.json();
if (!data || data.success === false || typeof data.tags !== 'object') {
throw new Error(data?.error || 'Invalid response payload');
}
const normalized = {};
Object.entries(data.tags).forEach(([modelType, tags]) => {
if (!Array.isArray(tags)) {
return;
}
const key = normalizeModelTypeKey(modelType) || (typeof modelType === 'string' ? modelType.toLowerCase() : '');
if (!key) {
return;
}
const filtered = tags
.filter((tag) => typeof tag === 'string')
.map((tag) => tag.trim())
.filter(Boolean);
if (!normalized[key]) {
normalized[key] = [];
}
normalized[key].push(...filtered);
});
const withDefaults = applyDefaultPriorityTagFallback(normalized);
cachedPriorityTagMap = withDefaults;
return withDefaults;
})
.catch(() => {
const fallback = buildDefaultPriorityTagMap();
cachedPriorityTagMap = fallback;
return fallback;
})
.finally(() => {
fetchPromise = null;
});
}
return fetchPromise;
}
export async function getPriorityTagSuggestions(modelType = null) {
const map = await getPriorityTagSuggestionsMap();
if (modelType) {
const lower = typeof modelType === 'string' ? modelType.toLowerCase() : '';
const normalizedKey = normalizeModelTypeKey(modelType);
const candidates = [];
if (lower) {
candidates.push(lower);
}
if (normalizedKey && !candidates.includes(normalizedKey)) {
candidates.push(normalizedKey);
}
Object.entries(MODEL_TYPE_ALIAS_MAP).forEach(([alias, target]) => {
if (alias === lower || target === normalizedKey) {
if (!candidates.includes(target)) {
candidates.push(target);
}
}
});
for (const key of candidates) {
if (Array.isArray(map[key])) {
return [...map[key]];
}
}
return [];
}
const unique = new Set();
Object.values(map).forEach((tags) => {
tags.forEach((tag) => {
unique.add(tag);
});
});
return Array.from(unique);
}
function applyDefaultPriorityTagFallback(map) {
const result = { ...buildDefaultPriorityTagMap(), ...map };
Object.entries(result).forEach(([key, tags]) => {
result[key] = dedupeTags(Array.isArray(tags) ? tags : []);
});
return result;
}
function buildDefaultPriorityTagMap() {
const map = {};
Object.entries(DEFAULT_PRIORITY_TAG_CONFIG).forEach(([modelType, configString]) => {
const entries = parsePriorityTagString(configString);
const key = normalizeModelTypeKey(modelType) || modelType;
map[key] = entries.map((entry) => entry.canonical);
});
return map;
}
function dedupeTags(tags) {
const seen = new Set();
const ordered = [];
tags.forEach((tag) => {
const normalized = tag.toLowerCase();
if (!seen.has(normalized)) {
seen.add(normalized);
ordered.push(tag);
}
});
return ordered;
}
export function getDefaultPriorityTagConfig() {
return { ...DEFAULT_PRIORITY_TAG_CONFIG };
}
export function invalidatePriorityTagSuggestionsCache() {
cachedPriorityTagMap = null;
fetchPromise = null;
}
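The grammar handled by the helpers above is: entries separated by commas or newlines, each either a bare canonical tag or `canonical(alias1|alias2)`. The condensed sketch below re-states that observable parsing behavior for illustration — it is not the module itself, and it omits the alias deduplication and validation that the real code performs:

```javascript
// Condensed illustration of the priority-tag grammar:
// "character(chara|oc), style" -> canonical tags with optional aliases.
function parsePriorityTags(raw) {
  return raw
    .split(/[\n,]/)
    .map((s) => s.trim())
    .filter(Boolean)
    .map((entry) => {
      const open = entry.indexOf('(');
      if (open === -1 || !entry.endsWith(')')) {
        // No alias section (or unbalanced parens): strip parens, keep canonical.
        return { canonical: entry.replace(/[()]/g, '').trim(), aliases: [] };
      }
      const canonical = entry.slice(0, open).trim();
      const aliases = entry
        .slice(open + 1, -1)
        .split('|')
        .map((a) => a.trim())
        .filter((a) => a && a.toLowerCase() !== canonical.toLowerCase());
      return { canonical, aliases };
    })
    .filter((e) => e.canonical);
}

console.log(parsePriorityTags('character(chara|oc), style'));
// → [ { canonical: 'character', aliases: [ 'chara', 'oc' ] },
//     { canonical: 'style', aliases: [] } ]
```

The server-backed suggestion map falls back to `DEFAULT_PRIORITY_TAG_CONFIG` parsed through this same grammar, so malformed entries fail validation in the settings UI before they can ever reach the cache.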


@@ -50,6 +50,21 @@
<span>{{ t('loras.bulkOperations.selected', {'count': 0}) }}</span>
</div>
<div class="context-menu-separator"></div>
<div class="context-menu-item" data-action="refresh-all">
<i class="fas fa-sync-alt"></i> <span>{{ t('loras.bulkOperations.refreshAll') }}</span>
</div>
<div class="context-menu-item" data-action="copy-all">
<i class="fas fa-copy"></i> <span>{{ t('loras.bulkOperations.copyAll') }}</span>
</div>
<div class="context-menu-item" data-action="send-to-workflow-append">
<i class="fas fa-paper-plane"></i> <span>{{ t('loras.contextMenu.sendToWorkflowAppend') }}</span>
</div>
<div class="context-menu-item" data-action="send-to-workflow-replace">
<i class="fas fa-exchange-alt"></i> <span>{{ t('loras.contextMenu.sendToWorkflowReplace') }}</span>
</div>
<div class="context-menu-item" data-action="auto-organize">
<i class="fas fa-magic"></i> <span>{{ t('loras.bulkOperations.autoOrganize') }}</span>
</div>
<div class="context-menu-item" data-action="add-tags">
<i class="fas fa-tags"></i> <span>{{ t('loras.bulkOperations.addTags') }}</span>
</div>
@@ -59,32 +74,13 @@
<div class="context-menu-item" data-action="set-content-rating">
<i class="fas fa-exclamation-triangle"></i> <span>{{ t('loras.bulkOperations.setContentRating') }}</span>
</div>
-<div class="context-menu-item" data-action="send-to-workflow-append">
-<i class="fas fa-paper-plane"></i> <span>{{ t('loras.contextMenu.sendToWorkflowAppend') }}</span>
-</div>
-<div class="context-menu-item" data-action="send-to-workflow-replace">
-<i class="fas fa-exchange-alt"></i> <span>{{ t('loras.contextMenu.sendToWorkflowReplace') }}</span>
-</div>
-<div class="context-menu-item" data-action="copy-all">
-<i class="fas fa-copy"></i> <span>{{ t('loras.bulkOperations.copyAll') }}</span>
-</div>
-<div class="context-menu-item" data-action="refresh-all">
-<i class="fas fa-sync-alt"></i> <span>{{ t('loras.bulkOperations.refreshAll') }}</span>
-</div>
+<div class="context-menu-separator"></div>
<div class="context-menu-item" data-action="move-all">
<i class="fas fa-folder-open"></i> <span>{{ t('loras.bulkOperations.moveAll') }}</span>
</div>
-<div class="context-menu-item" data-action="auto-organize">
-<i class="fas fa-magic"></i> <span>{{ t('loras.bulkOperations.autoOrganize') }}</span>
-</div>
-<div class="context-menu-separator"></div>
<div class="context-menu-item delete-item" data-action="delete-all">
<i class="fas fa-trash"></i> <span>{{ t('loras.bulkOperations.deleteAll') }}</span>
</div>
-<div class="context-menu-separator"></div>
-<div class="context-menu-item" data-action="clear">
-<i class="fas fa-times"></i> <span>{{ t('loras.bulkOperations.clear') }}</span>
-</div>
</div>
<div id="globalContextMenu" class="context-menu">


@@ -108,6 +108,12 @@
<h4><i class="fas fa-cog"></i> {{ t('help.documentation.settings') }}</h4>
<ul class="docs-links">
<li><a href="https://github.com/willmiao/ComfyUI-Lora-Manager/wiki/Configuration" target="_blank">Configuration Options (WIP)</a></li>
<li>
<a href="https://github.com/willmiao/ComfyUI-Lora-Manager/wiki/Priority-Tags-Configuration-Guide" target="_blank">
Priority Tags Configuration Guide
<span class="new-content-badge inline">{{ t('help.documentation.newBadge') }}</span>
</a>
</li>
</ul>
</div>


@@ -6,7 +6,7 @@
<h2>{{ t('common.actions.settings') }}</h2>
<button
type="button"
-class="settings-open-location-button"
+class="settings-action-link settings-open-location-trigger"
data-settings-path="{{ settings.settings_file }}"
aria-label="{{ t('settings.openSettingsFileLocation.tooltip') }}"
title="{{ t('settings.openSettingsFileLocation.tooltip') }}">
@@ -129,6 +129,28 @@
</div>
</div>
<!-- Add Model Name Display setting -->
<div class="setting-item">
<div class="setting-row">
<div class="setting-info">
<label for="modelNameDisplay">{{ t('settings.layoutSettings.modelNameDisplay') }}</label>
</div>
<div class="setting-control select-control">
<select id="modelNameDisplay" onchange="settingsManager.saveSelectSetting('modelNameDisplay', 'model_name_display')">
<option value="model_name">{{ t('settings.layoutSettings.modelNameDisplayOptions.modelName') }}</option>
<option value="file_name">{{ t('settings.layoutSettings.modelNameDisplayOptions.fileName') }}</option>
</select>
</div>
</div>
<div class="input-help">
{{ t('settings.layoutSettings.modelNameDisplayHelp') }}
<ul class="list-description">
<li><strong>{{ t('settings.layoutSettings.modelNameDisplayOptions.modelName') }}:</strong> {{ t('settings.layoutSettings.modelNameDisplayDetails.modelName') }}</li>
<li><strong>{{ t('settings.layoutSettings.modelNameDisplayOptions.fileName') }}:</strong> {{ t('settings.layoutSettings.modelNameDisplayDetails.fileName') }}</li>
</ul>
</div>
</div>
<!-- Add Card Info Display setting -->
<div class="setting-item">
<div class="setting-row">
@@ -370,6 +392,46 @@
</div>
</div>
<div class="setting-item priority-tags-item">
<div class="setting-row priority-tags-header">
<div class="setting-info">
<label>{{ t('settings.priorityTags.title') }}</label>
</div>
<div class="setting-control priority-tags-actions">
<a class="settings-action-link priority-tags-help-link" href="https://github.com/willmiao/ComfyUI-Lora-Manager/wiki/Priority-Tags-Configuration-Guide" target="_blank" rel="noopener" aria-label="{{ t('settings.priorityTags.helpLinkLabel') }}" title="{{ t('settings.priorityTags.helpLinkLabel') }}">
<i class="fas fa-question-circle" aria-hidden="true"></i>
</a>
</div>
</div>
<div class="input-help">{{ t('settings.priorityTags.description') }}</div>
<div class="priority-tags-tabs">
<input type="radio" id="priority-tags-tab-lora" name="priority-tags-tab" class="priority-tags-tab-input" checked>
<input type="radio" id="priority-tags-tab-checkpoint" name="priority-tags-tab" class="priority-tags-tab-input">
<input type="radio" id="priority-tags-tab-embedding" name="priority-tags-tab" class="priority-tags-tab-input">
<div class="priority-tags-tablist">
<label class="priority-tags-tab-label" for="priority-tags-tab-lora" id="priority-tags-tab-lora-label">{{ t('settings.priorityTags.modelTypes.lora') }}</label>
<label class="priority-tags-tab-label" for="priority-tags-tab-checkpoint" id="priority-tags-tab-checkpoint-label">{{ t('settings.priorityTags.modelTypes.checkpoint') }}</label>
<label class="priority-tags-tab-label" for="priority-tags-tab-embedding" id="priority-tags-tab-embedding-label">{{ t('settings.priorityTags.modelTypes.embedding') }}</label>
</div>
<div class="priority-tags-panels">
<div class="priority-tags-panel" id="priority-tags-panel-lora" aria-labelledby="priority-tags-tab-lora-label">
<textarea id="loraPriorityTagsInput" class="priority-tags-input" rows="3" placeholder="{{ t('settings.priorityTags.placeholder') }}"></textarea>
<div class="settings-input-error-message" id="loraPriorityTagsError"></div>
</div>
<div class="priority-tags-panel" id="priority-tags-panel-checkpoint" aria-labelledby="priority-tags-tab-checkpoint-label">
<textarea id="checkpointPriorityTagsInput" class="priority-tags-input" rows="3" placeholder="{{ t('settings.priorityTags.placeholder') }}"></textarea>
<div class="settings-input-error-message" id="checkpointPriorityTagsError"></div>
</div>
<div class="priority-tags-panel" id="priority-tags-panel-embedding" aria-labelledby="priority-tags-tab-embedding-label">
<textarea id="embeddingPriorityTagsInput" class="priority-tags-input" rows="3" placeholder="{{ t('settings.priorityTags.placeholder') }}"></textarea>
<div class="settings-input-error-message" id="embeddingPriorityTagsError"></div>
</div>
</div>
</div>
</div>
<!-- Add Example Images Settings Section -->
<div class="settings-section">
<h3>{{ t('settings.sections.exampleImages') }}</h3>


@@ -13,6 +13,9 @@
</svg>
<span class="progress-percent"></span>
</button>
<button id="stopExampleDownloadBtn" class="icon-button">
<i class="fas fa-stop"></i>
</button>
<button id="collapseProgressBtn" class="icon-button">
<i class="fas fa-chevron-down"></i>
</button>


@@ -10,7 +10,7 @@ const {
API_MODULE: new URL('../../../scripts/api.js', import.meta.url).pathname,
APP_MODULE: new URL('../../../scripts/app.js', import.meta.url).pathname,
CARET_HELPER_MODULE: new URL('../../../web/comfyui/textarea_caret_helper.js', import.meta.url).pathname,
-PREVIEW_COMPONENT_MODULE: new URL('../../../web/comfyui/loras_widget_components.js', import.meta.url).pathname,
+PREVIEW_COMPONENT_MODULE: new URL('../../../web/comfyui/preview_tooltip.js', import.meta.url).pathname,
AUTOCOMPLETE_MODULE: new URL('../../../web/comfyui/autocomplete.js', import.meta.url).pathname,
}));


@@ -37,6 +37,11 @@ vi.mock('../../../static/js/utils/constants.js', () => ({
DEFAULT_PATH_TEMPLATES: {},
MAPPABLE_BASE_MODELS: [],
PATH_TEMPLATE_PLACEHOLDERS: {},
DEFAULT_PRIORITY_TAG_CONFIG: {
lora: 'character, style',
checkpoint: 'base, guide',
embedding: 'hint',
},
}));
vi.mock('../../../static/js/utils/i18nHelpers.js', () => ({


@@ -0,0 +1,100 @@
import { describe, it, expect, beforeEach, afterEach, vi } from 'vitest';
import { DEFAULT_PRIORITY_TAG_CONFIG } from '../../../static/js/utils/constants.js';
const MODULE_PATH = '../../../static/js/utils/priorityTagHelpers.js';
let originalFetch;
let invalidateCacheFn;
beforeEach(() => {
originalFetch = global.fetch;
invalidateCacheFn = null;
vi.resetModules();
});
afterEach(() => {
if (invalidateCacheFn) {
invalidateCacheFn();
invalidateCacheFn = null;
}
if (originalFetch === undefined) {
delete global.fetch;
} else {
global.fetch = originalFetch;
}
vi.restoreAllMocks();
});
describe('priorityTagHelpers suggestion handling', () => {
it('returns trimmed, deduplicated suggestions scoped to the requested model type', async () => {
const fetchMock = vi.fn().mockResolvedValue({
ok: true,
json: async () => ({
success: true,
tags: {
loras: ['character', 'style ', 'style'],
checkpoints: ['Base ', 'Primary'],
},
}),
});
vi.stubGlobal('fetch', fetchMock);
const module = await import(MODULE_PATH);
invalidateCacheFn = module.invalidatePriorityTagSuggestionsCache;
const loraTags = await module.getPriorityTagSuggestions('loras');
expect(loraTags).toEqual(['character', 'style']);
const checkpointTags = await module.getPriorityTagSuggestions('CHECKPOINT');
expect(checkpointTags).toEqual(['Base', 'Primary']);
const aliasTags = await module.getPriorityTagSuggestions('lora');
expect(aliasTags).toEqual(['character', 'style']);
const defaultEmbedding = module
.parsePriorityTagString(DEFAULT_PRIORITY_TAG_CONFIG.embedding)
.map((entry) => entry.canonical);
const embeddingTags = await module.getPriorityTagSuggestions('embeddings');
expect(embeddingTags).toEqual(defaultEmbedding);
expect(fetchMock).toHaveBeenCalledTimes(1);
});
it('returns a unique union of suggestions when no model type is provided', async () => {
const fetchMock = vi.fn().mockResolvedValue({
ok: true,
json: async () => ({
success: true,
tags: {
lora: ['primary', 'support'],
checkpoint: ['guide', 'primary'],
embeddings: ['hint'],
},
}),
});
vi.stubGlobal('fetch', fetchMock);
const module = await import(MODULE_PATH);
invalidateCacheFn = module.invalidatePriorityTagSuggestionsCache;
const suggestions = await module.getPriorityTagSuggestions();
expect(suggestions).toEqual(['primary', 'support', 'guide', 'hint']);
});
it('falls back to default configuration when fetching suggestions fails', async () => {
const fetchMock = vi.fn().mockRejectedValue(new Error('network error'));
vi.stubGlobal('fetch', fetchMock);
const module = await import(MODULE_PATH);
invalidateCacheFn = module.invalidatePriorityTagSuggestionsCache;
const expected = module
.parsePriorityTagString(DEFAULT_PRIORITY_TAG_CONFIG.lora)
.map((entry) => entry.canonical);
const result = await module.getPriorityTagSuggestions('loras');
expect(result).toEqual(expected);
});
});
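The suggestion handling pinned down above (trim whitespace, deduplicate while keeping first-seen order) can be sketched in isolation. `normalize_suggestions` is a hypothetical name for illustration, not the helper module's actual export:

```python
def normalize_suggestions(tags):
    """Trim whitespace and deduplicate while preserving first-seen order."""
    seen = set()
    result = []
    for tag in tags:
        cleaned = tag.strip()
        if cleaned and cleaned not in seen:
            seen.add(cleaned)
            result.append(cleaned)
    return result

# Mirrors the first expectation above: 'style ' and 'style' collapse to one entry.
print(normalize_suggestions(['character', 'style ', 'style']))  # ['character', 'style']
```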


@@ -5,6 +5,8 @@ import sys
from pathlib import Path
import types
from dataclasses import dataclass, field
from typing import Optional
folder_paths_stub = types.SimpleNamespace(get_folder_paths=lambda *_: [])
sys.modules.setdefault("folder_paths", folder_paths_stub)
@@ -16,6 +18,7 @@ from aiohttp.test_utils import TestClient, TestServer
from py.config import config
from py.routes.base_model_routes import BaseModelRoutes
from py.services import model_file_service
from py.services.downloader import DownloadProgress
from py.services.metadata_sync_service import MetadataSyncService
from py.services.model_file_service import AutoOrganizeResult
from py.services.service_registry import ServiceRegistry
@@ -30,6 +33,41 @@ class DummyRoutes(BaseModelRoutes):
def setup_specific_routes(self, registrar, prefix: str) -> None:  # pragma: no cover - no extra routes in smoke tests
return None
def __init__(self, service=None):
super().__init__(service)
self.set_model_update_service(NullModelUpdateService())
@dataclass
class NullUpdateRecord:
model_type: str
model_id: int
largest_version_id: int | None = None
version_ids: list[int] = field(default_factory=list)
in_library_version_ids: list[int] = field(default_factory=list)
last_checked_at: float | None = None
should_ignore: bool = False
def has_update(self) -> bool:
return False
class NullModelUpdateService:
async def refresh_for_model_type(self, *args, **kwargs):
return {}
async def refresh_single_model(self, *args, **kwargs):
return None
async def update_in_library_versions(self, model_type, model_id, version_ids):
return NullUpdateRecord(model_type=model_type, model_id=model_id, in_library_version_ids=list(version_ids))
async def set_should_ignore(self, model_type, model_id, should_ignore):
return NullUpdateRecord(model_type=model_type, model_id=model_id, should_ignore=should_ignore)
async def get_record(self, *args, **kwargs):
return None
async def create_test_client(service) -> TestClient:
routes = DummyRoutes(service)
@@ -59,12 +97,21 @@ def download_manager_stub():
self.error = None
self.cancelled = []
self.active_downloads = {}
self.last_progress_snapshot: Optional[DownloadProgress] = None
async def download_from_civitai(self, **kwargs):
self.calls.append(kwargs)
if self.error is not None:
raise self.error
-await kwargs["progress_callback"](42)
+snapshot = DownloadProgress(
percent_complete=50.0,
bytes_downloaded=5120,
total_bytes=10240,
bytes_per_second=2048.0,
timestamp=0.0,
)
self.last_progress_snapshot = snapshot
await kwargs["progress_callback"](snapshot)
return {"success": True, "path": "/tmp/model.safetensors"}
async def cancel_download(self, download_id):
@@ -332,7 +379,11 @@ def test_download_model_invokes_download_manager(
assert call_args["download_id"] == payload["download_id"]
progress = ws_manager.get_download_progress(payload["download_id"])
assert progress is not None
-assert progress["progress"] == 42
+expected_progress = round(download_manager_stub.last_progress_snapshot.percent_complete)
assert progress["progress"] == expected_progress
assert progress["bytes_downloaded"] == download_manager_stub.last_progress_snapshot.bytes_downloaded
assert progress["total_bytes"] == download_manager_stub.last_progress_snapshot.total_bytes
assert progress["bytes_per_second"] == download_manager_stub.last_progress_snapshot.bytes_per_second
assert "timestamp" in progress
progress_response = await client.get(
@@ -341,7 +392,13 @@ def test_download_model_invokes_download_manager(
progress_payload = await progress_response.json()
assert progress_response.status == 200
-assert progress_payload == {"success": True, "progress": 42}
+assert progress_payload == {
"success": True,
"progress": expected_progress,
"bytes_downloaded": download_manager_stub.last_progress_snapshot.bytes_downloaded,
"total_bytes": download_manager_stub.last_progress_snapshot.total_bytes,
"bytes_per_second": download_manager_stub.last_progress_snapshot.bytes_per_second,
}
ws_manager.cleanup_download_progress(payload["download_id"])
finally:
await client.close()


@@ -41,9 +41,11 @@ class StubDownloadManager:
def __init__(self) -> None:
self.pause_calls = 0
self.resume_calls = 0
self.stop_calls = 0
self.force_payloads: list[dict[str, Any]] = []
self.pause_error: Exception | None = None
self.resume_error: Exception | None = None
self.stop_error: Exception | None = None
self.force_error: Exception | None = None
async def get_status(self, request: web.Request) -> dict[str, Any]:
@@ -61,6 +63,12 @@
raise self.resume_error
return {"success": True, "message": "resumed"}
async def stop_download(self, request: web.Request) -> dict[str, Any]:
self.stop_calls += 1
if self.stop_error:
raise self.stop_error
return {"success": True, "message": "stopping"}
async def start_force_download(self, payload: dict[str, Any]) -> dict[str, Any]:
self.force_payloads.append(payload)
if self.force_error:
@@ -193,17 +201,22 @@ async def test_pause_and_resume_return_client_errors_when_not_running():
async with registrar_app() as harness:
harness.download_manager.pause_error = DownloadNotRunningError()
harness.download_manager.resume_error = DownloadNotRunningError("Stopped")
harness.download_manager.stop_error = DownloadNotRunningError("Not running")
pause_response = await harness.client.post("/api/lm/pause-example-images")
resume_response = await harness.client.post("/api/lm/resume-example-images")
stop_response = await harness.client.post("/api/lm/stop-example-images")
assert pause_response.status == 400
assert resume_response.status == 400
assert stop_response.status == 400
pause_body = await _json(pause_response)
resume_body = await _json(resume_response)
stop_body = await _json(stop_response)
assert pause_body == {"success": False, "error": "No download in progress"}
assert resume_body == {"success": False, "error": "Stopped"}
assert stop_body == {"success": False, "error": "Not running"}
async def test_import_route_returns_validation_errors():


@@ -51,6 +51,10 @@ class StubDownloadManager:
self.calls.append(("resume_download", None))
return {"operation": "resume_download"}
async def stop_download(self, request: web.Request) -> dict:
self.calls.append(("stop_download", None))
return {"operation": "stop_download"}
async def start_force_download(self, payload: Any) -> dict:
self.calls.append(("start_force_download", payload))
return {"operation": "start_force_download", "payload": payload}
@@ -195,19 +199,23 @@ async def test_status_route_returns_manager_payload():
assert harness.download_manager.calls == [("get_status", {"detail": "true"})]
-async def test_pause_and_resume_routes_delegate():
+async def test_pause_resume_and_stop_routes_delegate():
async with example_images_app() as harness:
pause_response = await harness.client.post("/api/lm/pause-example-images")
resume_response = await harness.client.post("/api/lm/resume-example-images")
stop_response = await harness.client.post("/api/lm/stop-example-images")
assert pause_response.status == 200
assert await pause_response.json() == {"operation": "pause_download"}
assert resume_response.status == 200
assert await resume_response.json() == {"operation": "resume_download"}
assert stop_response.status == 200
assert await stop_response.json() == {"operation": "stop_download"}
-assert harness.download_manager.calls[-2:] == [
+assert harness.download_manager.calls[-3:] == [
("pause_download", None),
("resume_download", None),
("stop_download", None),
]
@@ -309,6 +317,10 @@ async def test_download_handler_methods_delegate() -> None:
self.calls.append(("resume_download", request))
return {"status": "running"}
async def stop_download(self, request) -> dict:
self.calls.append(("stop_download", request))
return {"status": "stopping"}
async def start_force_download(self, payload) -> dict:
self.calls.append(("start_force_download", payload))
return {"status": "force", "payload": payload}
@@ -342,6 +354,8 @@ async def test_download_handler_methods_delegate() -> None:
assert json.loads(pause_response.text) == {"status": "paused"}
resume_response = await handler.resume_example_images(request)
assert json.loads(resume_response.text) == {"status": "running"}
stop_response = await handler.stop_example_images(request)
assert json.loads(stop_response.text) == {"status": "stopping"}
force_response = await handler.force_download_example_images(request)
assert json.loads(force_response.text) == {"status": "force", "payload": {"foo": "bar"}}
@@ -350,6 +364,7 @@ async def test_download_handler_methods_delegate() -> None:
("get_status", request),
("pause_download", request),
("resume_download", request),
("stop_download", request),
("start_force_download", {"foo": "bar"}),
]
@@ -460,6 +475,7 @@ def test_handler_set_route_mapping_includes_all_handlers() -> None:
"get_example_images_status",
"pause_example_images",
"resume_example_images",
"stop_example_images",
"force_download_example_images",
"import_example_images",
"delete_example_image",


@@ -67,6 +67,19 @@ class StubSearchStrategy:
return list(self.search_result)
class StubUpdateService:
def __init__(self, decisions):
self.decisions = dict(decisions)
self.calls = []
async def has_update(self, model_type, model_id):
self.calls.append((model_type, model_id))
result = self.decisions.get(model_id, False)
if isinstance(result, Exception):
raise result
return result
@pytest.mark.asyncio
async def test_get_paginated_data_uses_injected_collaborators():
data = [
@@ -272,3 +285,111 @@
assert response["page"] == 2
assert response["page_size"] == 2
assert response["total_pages"] == 3
@pytest.mark.asyncio
async def test_get_paginated_data_filters_by_update_status():
items = [
{"model_name": "A", "civitai": {"modelId": 1}},
{"model_name": "B", "civitai": {"modelId": 2}},
{"model_name": "C", "civitai": {"modelId": 3}},
]
repository = StubRepository(items)
filter_set = PassThroughFilterSet()
search_strategy = NoSearchStrategy()
update_service = StubUpdateService({1: True, 2: False, 3: True})
settings = StubSettings({})
service = DummyService(
model_type="stub",
scanner=object(),
metadata_class=BaseModelMetadata,
cache_repository=repository,
filter_set=filter_set,
search_strategy=search_strategy,
settings_provider=settings,
update_service=update_service,
)
response = await service.get_paginated_data(
page=1,
page_size=5,
sort_by="name:asc",
has_update=True,
)
assert update_service.calls == [("stub", 1), ("stub", 2), ("stub", 3)]
assert response["items"] == [items[0], items[2]]
assert response["total"] == 2
assert response["page"] == 1
assert response["page_size"] == 5
assert response["total_pages"] == 1
@pytest.mark.asyncio
async def test_get_paginated_data_has_update_without_service_returns_empty():
items = [
{"model_name": "A", "civitai": {"modelId": 1}},
{"model_name": "B", "civitai": {"modelId": 2}},
]
repository = StubRepository(items)
filter_set = PassThroughFilterSet()
search_strategy = NoSearchStrategy()
settings = StubSettings({})
service = DummyService(
model_type="stub",
scanner=object(),
metadata_class=BaseModelMetadata,
cache_repository=repository,
filter_set=filter_set,
search_strategy=search_strategy,
settings_provider=settings,
)
response = await service.get_paginated_data(
page=1,
page_size=10,
sort_by="name:asc",
has_update=True,
)
assert response["items"] == []
assert response["total"] == 0
assert response["total_pages"] == 0
@pytest.mark.asyncio
async def test_get_paginated_data_skips_items_when_update_check_fails():
items = [
{"model_name": "A", "civitai": {"modelId": 1}},
{"model_name": "B", "civitai": {"modelId": 2}},
]
repository = StubRepository(items)
filter_set = PassThroughFilterSet()
search_strategy = NoSearchStrategy()
update_service = StubUpdateService({1: True, 2: RuntimeError("boom")})
settings = StubSettings({})
service = DummyService(
model_type="stub",
scanner=object(),
metadata_class=BaseModelMetadata,
cache_repository=repository,
filter_set=filter_set,
search_strategy=search_strategy,
settings_provider=settings,
update_service=update_service,
)
response = await service.get_paginated_data(
page=1,
page_size=10,
sort_by="name:asc",
has_update=True,
)
assert update_service.calls == [("stub", 1), ("stub", 2)]
assert response["items"] == [items[0]]
assert response["total"] == 1
assert response["total_pages"] == 1
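The `has_update` filtering behaviour these tests pin down — keep items whose lookup returns True, drop items whose lookup raises — can be sketched as follows. Names and structure are illustrative, not the actual `BaseModelService` code; the stub matches the shape used in the tests above:

```python
import asyncio

class StubUpdateService:
    """Per-model decisions; an Exception value is raised instead of returned."""
    def __init__(self, decisions):
        self.decisions = dict(decisions)

    async def has_update(self, model_type, model_id):
        result = self.decisions.get(model_id, False)
        if isinstance(result, Exception):
            raise result
        return result

async def filter_has_update(items, update_service, model_type):
    # Keep items whose Civitai model reports a newer version; a failed
    # lookup silently excludes the item instead of failing the whole page.
    kept = []
    for item in items:
        model_id = (item.get("civitai") or {}).get("modelId")
        if model_id is None:
            continue
        try:
            if await update_service.has_update(model_type, model_id):
                kept.append(item)
        except Exception:
            continue
    return kept

items = [
    {"model_name": "A", "civitai": {"modelId": 1}},
    {"model_name": "B", "civitai": {"modelId": 2}},
]
service = StubUpdateService({1: True, 2: RuntimeError("boom")})
kept = asyncio.run(filter_has_update(items, service, "stub"))
assert [item["model_name"] for item in kept] == ["A"]
```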


@@ -0,0 +1,255 @@
import copy
from unittest.mock import AsyncMock
import pytest
from py.services import civarchive_client as civarchive_client_module
from py.services.civarchive_client import CivArchiveClient
from py.services.errors import RateLimitError
from py.services.model_metadata_provider import ModelMetadataProviderManager
class DummyDownloader:
def __init__(self):
self.calls = []
async def make_request(self, method, url, use_auth=False, **kwargs):
self.calls.append({"method": method, "url": url, "params": kwargs.get("params")})
return True, {}
@pytest.fixture(autouse=True)
def reset_singletons():
CivArchiveClient._instance = None
ModelMetadataProviderManager._instance = None
yield
CivArchiveClient._instance = None
ModelMetadataProviderManager._instance = None
@pytest.fixture
def downloader(monkeypatch):
instance = DummyDownloader()
monkeypatch.setattr(civarchive_client_module, "get_downloader", AsyncMock(return_value=instance))
return instance
def _base_civarchive_payload(version_id=1976567, *, trigger="mxpln", nsfw_level=31):
version_name = "v2.0" if version_id != 1976567 else "v1.0"
file_sha = "e2b7a280d6539556f23f380b3f71e4e22bc4524445c4c96526e117c6005c6ad3"
return {
"data": {
"id": 1746460,
"name": "Mixplin Style [Illustrious]",
"type": "LORA",
"description": "description",
"is_nsfw": True,
"nsfw_level": nsfw_level,
"tags": ["art", "style"],
"creator_username": "Ty_Lee",
"creator_name": "Ty_Lee",
"creator_url": "/users/Ty_Lee",
"version": {
"id": version_id,
"modelId": 1746460,
"name": version_name,
"baseModel": "Illustrious",
"description": "version description",
"downloadCount": 437,
"ratingCount": 0,
"rating": 0,
"nsfw_level": nsfw_level,
"trigger": [trigger],
"files": [
{
"id": 1874043,
"name": "mxpln-illustrious-ty_lee.safetensors",
"type": "Model",
"sizeKB": 223124.37109375,
"downloadUrl": "https://civitai.com/api/download/models/1976567",
"sha256": file_sha,
"is_primary": False,
"mirrors": [
{
"filename": "mxpln-illustrious-ty_lee.safetensors",
"url": "https://civitai.com/api/download/models/1976567",
"deletedAt": None,
}
],
}
],
"images": [
{
"id": 86403595,
"url": "https://img.genur.art/example.png",
"nsfwLevel": 1,
}
],
},
"versions": [
{"id": 2042594, "name": "v2.0"},
{"id": 1976567, "name": "v1.0"},
],
}
}
async def test_get_model_by_hash_transforms_payload(downloader):
payload = _base_civarchive_payload()
async def fake_make_request(method, url, use_auth=False, **kwargs):
downloader.calls.append({"url": url, "params": kwargs.get("params")})
if url.endswith("/sha256/abc"):
return True, copy.deepcopy(payload)
return False, "unexpected"
downloader.make_request = fake_make_request
client = await CivArchiveClient.get_instance()
result, error = await client.get_model_by_hash("abc")
assert error is None
assert result["id"] == 1976567
assert result["nsfwLevel"] == 31
assert result["trainedWords"] == ["mxpln"]
assert result["stats"] == {"downloadCount": 437, "ratingCount": 0, "rating": 0}
assert result["model"]["name"] == "Mixplin Style [Illustrious]"
assert result["model"]["nsfw"] is True
assert result["creator"]["username"] == "Ty_Lee"
assert result["creator"]["image"] == ""
file_meta = result["files"][0]
assert file_meta["hashes"]["SHA256"] == "E2B7A280D6539556F23F380B3F71E4E22BC4524445C4C96526E117C6005C6AD3"
assert file_meta["mirrors"][0]["url"] == "https://civitai.com/api/download/models/1976567"
assert file_meta["primary"] is True
assert result["source"] == "civarchive"
assert result["images"][0]["url"] == "https://img.genur.art/example.png"
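The file-metadata assertions above imply two normalization steps: the lowercase sha256 from CivArchive is uppercased under `hashes.SHA256`, and a sole file ends up marked primary even though `is_primary` is `False`. A hedged sketch of just that handling — a hypothetical helper, not the client's real method, and the sole-file-is-primary rule is an inference from the test:

```python
def normalize_file_entry(file_entry: dict) -> dict:
    # CivArchive reports a lowercase sha256; the Civitai-style metadata the
    # client returns carries it uppercased under hashes.SHA256. Treating a
    # lone file as primary (even when is_primary is False) matches the
    # assertion above; whether the real client uses this exact rule is an
    # assumption.
    return {
        "name": file_entry["name"],
        "hashes": {"SHA256": file_entry["sha256"].upper()},
        "mirrors": file_entry.get("mirrors", []),
        "primary": True,
    }

meta = normalize_file_entry({"name": "m.safetensors", "sha256": "abc123", "mirrors": []})
assert meta["hashes"]["SHA256"] == "ABC123"
```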
async def test_get_model_versions_fetches_each_version(downloader):
base_url = "https://civarchive.com/api/models/1746460"
base_payload = _base_civarchive_payload(version_id=2042594, trigger="mxpln-new", nsfw_level=5)
other_payload = _base_civarchive_payload()
responses = {
(base_url, None): base_payload,
(base_url, (("modelVersionId", "2042594"),)): base_payload,
(base_url, (("modelVersionId", "1976567"),)): other_payload,
}
async def fake_make_request(method, url, use_auth=False, **kwargs):
params = kwargs.get("params")
key = (url, tuple(sorted((params or {}).items())) if params else None)
downloader.calls.append({"url": url, "params": params})
if key in responses:
return True, copy.deepcopy(responses[key])
return False, "unexpected"
downloader.make_request = fake_make_request
client = await CivArchiveClient.get_instance()
result = await client.get_model_versions("1746460")
assert result["name"] == "Mixplin Style [Illustrious]"
assert result["type"] == "LORA"
versions = result["modelVersions"]
assert [version["id"] for version in versions] == [2042594, 1976567]
assert versions[0]["trainedWords"] == ["mxpln-new"]
assert versions[1]["trainedWords"] == ["mxpln"]
assert versions[0]["nsfwLevel"] == 5
assert versions[1]["nsfwLevel"] == 31
assert any(call["params"] == {"modelVersionId": "2042594"} for call in downloader.calls)
assert any(call["params"] == {"modelVersionId": "1976567"} for call in downloader.calls)
async def test_get_model_version_redirects_to_actual_model_id(downloader):
first_payload = _base_civarchive_payload()
first_payload["data"]["version"]["modelId"] = 222
base_url_request = "https://civarchive.com/api/models/111"
redirected_url_request = "https://civarchive.com/api/models/222"
async def fake_make_request(method, url, use_auth=False, **kwargs):
downloader.calls.append({"url": url, "params": kwargs.get("params")})
params = kwargs.get("params") or {}
if url == base_url_request:
return True, copy.deepcopy(first_payload)
if url == redirected_url_request and params.get("modelVersionId") == "1976567":
return True, copy.deepcopy(_base_civarchive_payload())
return False, "unexpected"
downloader.make_request = fake_make_request
client = await CivArchiveClient.get_instance()
result = await client.get_model_version(model_id=111, version_id=1976567)
assert result is not None
assert result["model"]["name"] == "Mixplin Style [Illustrious]"
assert len(downloader.calls) == 2
assert downloader.calls[1]["url"] == redirected_url_request
async def test_get_model_by_hash_uses_file_fallback(downloader, monkeypatch):
file_only_payload = {
"data": {
"files": [
{
"model_id": 1746460,
"model_version_id": 1976567,
"source": "civitai",
}
]
}
}
version_payload = _base_civarchive_payload()
async def fake_make_request(method, url, use_auth=False, **kwargs):
downloader.calls.append({"url": url, "params": kwargs.get("params")})
if "/sha256/" in url:
return True, copy.deepcopy(file_only_payload)
if "/models/1746460" in url:
return True, copy.deepcopy(version_payload)
return False, "unexpected"
downloader.make_request = fake_make_request
client = await CivArchiveClient.get_instance()
result, error = await client.get_model_by_hash("fallback")
assert error is None
assert result["id"] == 1976567
assert result["model"]["name"] == "Mixplin Style [Illustrious]"
assert any("/models/1746460" in call["url"] for call in downloader.calls)
async def test_get_model_by_hash_handles_not_found(downloader):
async def fake_make_request(method, url, use_auth=False, **kwargs):
return False, "Resource not found"
downloader.make_request = fake_make_request
client = await CivArchiveClient.get_instance()
result, error = await client.get_model_by_hash("missing")
assert result is None
assert error == "Model not found"
async def test_get_model_by_hash_propagates_rate_limit(downloader):
async def fake_make_request(method, url, use_auth=False, **kwargs):
return False, RateLimitError("limited", retry_after=5)
downloader.make_request = fake_make_request
client = await CivArchiveClient.get_instance()
with pytest.raises(RateLimitError) as exc_info:
await client.get_model_by_hash("limited")
assert exc_info.value.retry_after == 5
assert exc_info.value.provider == "civarchive_api"


@@ -5,6 +5,7 @@ import pytest
from py.services import civitai_client as civitai_client_module
from py.services.civitai_client import CivitaiClient
from py.services.errors import RateLimitError
from py.services.model_metadata_provider import ModelMetadataProviderManager

@@ -106,6 +107,21 @@ async def test_get_model_by_hash_handles_not_found(monkeypatch, downloader):
    assert error == "Model not found"


async def test_get_model_by_hash_propagates_rate_limit(monkeypatch, downloader):
    async def fake_make_request(method, url, use_auth=True):
        return False, RateLimitError("limited", retry_after=4)

    downloader.make_request = fake_make_request
    client = await CivitaiClient.get_instance()

    with pytest.raises(RateLimitError) as exc_info:
        await client.get_model_by_hash("limited")

    assert exc_info.value.retry_after == 4
    assert exc_info.value.provider == "civitai_api"


async def test_download_preview_image_writes_file(tmp_path, downloader):
    client = await CivitaiClient.get_instance()
    target = tmp_path / "preview" / "image.jpg"


@@ -0,0 +1,117 @@
from __future__ import annotations

from dataclasses import dataclass, field
from typing import Any, Dict, List, Tuple
from unittest.mock import AsyncMock

from py.services.download_coordinator import DownloadCoordinator


@dataclass
class StubWebSocketManager:
    progress: Dict[str, Dict[str, Any]] = field(default_factory=dict)
    broadcasts: List[Tuple[str, Dict[str, Any]]] = field(default_factory=list)

    def generate_download_id(self) -> str:
        return "generated"

    def get_download_progress(self, download_id: str) -> Dict[str, Any] | None:
        return self.progress.get(download_id)

    async def broadcast_download_progress(self, download_id: str, payload: Dict[str, Any]) -> None:
        self.broadcasts.append((download_id, payload))


async def test_pause_download_broadcasts_cached_state():
    ws_manager = StubWebSocketManager(
        progress={
            "dl": {
                "progress": 45,
                "bytes_downloaded": 1024,
                "total_bytes": 2048,
                "bytes_per_second": 256.0,
            }
        }
    )
    download_manager = AsyncMock()
    download_manager.pause_download = AsyncMock(return_value={"success": True})

    async def factory():
        return download_manager

    coordinator = DownloadCoordinator(ws_manager=ws_manager, download_manager_factory=factory)

    result = await coordinator.pause_download("dl")

    assert result == {"success": True}
    assert ws_manager.broadcasts == [
        (
            "dl",
            {
                "status": "paused",
                "progress": 45,
                "download_id": "dl",
                "message": "Download paused by user",
                "bytes_downloaded": 1024,
                "total_bytes": 2048,
                "bytes_per_second": 0.0,
            },
        )
    ]


async def test_resume_download_broadcasts_cached_state():
    ws_manager = StubWebSocketManager(
        progress={
            "dl": {
                "progress": 75,
                "bytes_downloaded": 2048,
                "total_bytes": 4096,
                "bytes_per_second": 512.0,
            }
        }
    )
    download_manager = AsyncMock()
    download_manager.resume_download = AsyncMock(return_value={"success": True})

    async def factory():
        return download_manager

    coordinator = DownloadCoordinator(ws_manager=ws_manager, download_manager_factory=factory)

    result = await coordinator.resume_download("dl")

    assert result == {"success": True}
    assert ws_manager.broadcasts == [
        (
            "dl",
            {
                "status": "downloading",
                "progress": 75,
                "download_id": "dl",
                "message": "Download resumed by user",
                "bytes_downloaded": 2048,
                "total_bytes": 4096,
                "bytes_per_second": 512.0,
            },
        )
    ]


async def test_pause_download_does_not_broadcast_on_failure():
    ws_manager = StubWebSocketManager()
    download_manager = AsyncMock()
    download_manager.pause_download = AsyncMock(return_value={"success": False, "error": "nope"})

    async def factory():
        return download_manager

    coordinator = DownloadCoordinator(ws_manager=ws_manager, download_manager_factory=factory)

    result = await coordinator.pause_download("dl")

    assert result == {"success": False, "error": "nope"}
    assert ws_manager.broadcasts == []


@@ -1,3 +1,4 @@
import asyncio
import os
from pathlib import Path
from types import SimpleNamespace

@@ -108,6 +109,7 @@ def metadata_provider(monkeypatch):
            "creator": {"username": "Author"},
            "files": [
                {
                    "type": "Model",
                    "primary": True,
                    "downloadUrl": "https://example.invalid/file.safetensors",
                    "name": "file.safetensors",

@@ -206,6 +208,7 @@ async def test_download_uses_active_mirrors(monkeypatch, scanners, metadata_prov
            "creator": {"username": "Author"},
            "files": [
                {
                    "type": "Model",
                    "primary": True,
                    "downloadUrl": "https://example.invalid/file.safetensors",
                    "mirrors": [

@@ -396,6 +399,67 @@ async def test_execute_download_retries_urls(monkeypatch, tmp_path):
    assert dummy_scanner.calls  # ensure cache updated


async def test_pause_download_updates_state():
    manager = DownloadManager()
    download_id = "dl"
    manager._download_tasks[download_id] = object()
    pause_event = asyncio.Event()
    pause_event.set()
    manager._pause_events[download_id] = pause_event
    manager._active_downloads[download_id] = {
        "status": "downloading",
        "bytes_per_second": 42.0,
    }

    result = await manager.pause_download(download_id)

    assert result == {"success": True, "message": "Download paused successfully"}
    assert download_id in manager._pause_events
    assert manager._pause_events[download_id].is_set() is False
    assert manager._active_downloads[download_id]["status"] == "paused"
    assert manager._active_downloads[download_id]["bytes_per_second"] == 0.0


async def test_pause_download_rejects_unknown_task():
    manager = DownloadManager()

    result = await manager.pause_download("missing")

    assert result == {"success": False, "error": "Download task not found"}


async def test_resume_download_sets_event_and_status():
    manager = DownloadManager()
    download_id = "dl"
    pause_event = asyncio.Event()
    manager._pause_events[download_id] = pause_event
    manager._active_downloads[download_id] = {
        "status": "paused",
        "bytes_per_second": 0.0,
    }

    result = await manager.resume_download(download_id)

    assert result == {"success": True, "message": "Download resumed successfully"}
    assert manager._pause_events[download_id].is_set() is True
    assert manager._active_downloads[download_id]["status"] == "downloading"


async def test_resume_download_rejects_when_not_paused():
    manager = DownloadManager()
    download_id = "dl"
    pause_event = asyncio.Event()
    pause_event.set()
    manager._pause_events[download_id] = pause_event

    result = await manager.resume_download(download_id)

    assert result == {"success": False, "error": "Download is not paused"}


@pytest.mark.asyncio
async def test_execute_download_uses_rewritten_civitai_preview(monkeypatch, tmp_path):
    manager = DownloadManager()
Some files were not shown because too many files have changed in this diff.