Compare commits

...

9 Commits

Author SHA1 Message Date
Will Miao
36e3e62e70 feat: add filter presets and update version to v0.9.15
- Added filter presets feature allowing users to save and quickly switch between filter combinations
- Fixed various bugs to improve overall stability
- Updated project version from 0.9.14 to 0.9.15 in pyproject.toml
2026-02-04 09:12:37 +08:00
Will Miao
7bcf4e4491 feat(config): discover deep symlinks dynamically when accessing previews 2026-02-04 00:16:59 +08:00
Will Miao
c12aefa82a fix(recipes): detect duplicates for remote imports using modelVersionId and Civitai URL, #750
- Use modelVersionId as fallback for all loras in fingerprint calculation (not just deleted ones)
- Add URL-based duplicate detection using source_path field
- Combine both fingerprint and URL-based duplicate detection in API response
- Fix _download_remote_media return type and unbound variable issue
2026-02-03 21:32:15 +08:00
Will Miao
990a3527e4 feat(ui): improve filter preset delete button visibility and layout
- Hide delete button by default and show on hover for inactive presets
- Show delete button on active presets only when hovering over the preset
- Add ellipsis truncation for long preset names to prevent layout breakage
- Remove checkmark icon from active preset names for cleaner visual design
2026-02-03 20:05:39 +08:00
Will Miao
655d3cab71 fix(config): prioritize checkpoints over unet when paths overlap, #799
When checkpoints and unet folders point to the same physical location
(via symlinks), prioritize checkpoints for backward compatibility.

This prevents the 'Failed to load Checkpoint root' error that users
experience when they have incorrectly configured their ComfyUI paths.

Changes:
- Detect overlapping real paths between checkpoints and unet
- Log warning to inform users of the configuration issue
- Remove overlapping paths from unet_map, keeping checkpoints

Fixes #<issue-number>
2026-02-03 18:27:42 +08:00
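The overlap-removal step described in this commit can be sketched as a set intersection over real (symlink-resolved) paths. This is a minimal illustration, not the project's actual code; the function name and the `log` parameter are invented for the example, and the maps are assumed to be keyed by resolved real path with the originally configured path as value.

```python
def remove_overlapping_unet_paths(checkpoint_map, unet_map, log=print):
    """Drop entries from unet_map whose real path also appears in
    checkpoint_map, so checkpoints win when both point at the same folder."""
    overlapping = set(checkpoint_map) & set(unet_map)
    if overlapping:
        log(f"Overlapping checkpoint/unet paths: {sorted(overlapping)}")
    for real_path in overlapping:
        del unet_map[real_path]
    return unet_map
```

With `checkpoint_map = {"/models/shared": "/cfg/checkpoints"}` and `unet_map` containing the same `/models/shared` key, the shared entry survives only on the checkpoint side.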
Will Miao
358e658459 fix(trigger_word_toggle): add trigger word normalization method
Introduce a new private method `_normalize_trigger_words` to handle consistent splitting and cleaning of trigger word strings. This method splits input by both single and double commas, strips whitespace, and filters out empty strings, returning a set of normalized words. It is now used in `process_trigger_words` to compare trigger word overrides, ensuring accurate detection of changes by comparing normalized sets instead of raw strings.
2026-02-03 15:42:09 +08:00
Will Miao
f28c32f2b1 feat(lora-cycler): increase repeat input width for better usability
The width of the repeat input field in the LoRA cycler settings view has been increased from 40px to 50px. This change improves usability by providing more space for user input, making the control easier to interact with and reducing visual crowding.
2026-02-03 09:40:55 +08:00
Will Miao
f5dbd6b8e8 fix(metadata): auto-disable archive_db setting when database file is missing 2026-02-03 08:36:27 +08:00
Will Miao
2c026a2646 fix(metadata-sync): persist db_checked flag for deleted models
When a deleted model is checked against the SQLite archive and not found, the `db_checked` flag was set in memory but never saved to disk. This occurred because the save operation was only triggered when `civitai_api_not_found` was True, which is not the case for deleted models (since the CivitAI API is not attempted). As a result, deleted models would be rechecked on every refresh instead of being skipped.

Changes:
- Introduce a `needs_save` flag to track when metadata state is updated
- Save metadata whenever `db_checked` is set to True, regardless of API status
- Ensure `last_checked_at` is set for SQLite-only attempts
- Add regression test to verify the fix
2026-02-03 07:34:41 +08:00
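The `needs_save` pattern from this commit decouples "state changed" from "the CivitAI API 404'd". A minimal sketch, with invented names (`reconcile_check_flags`, the `save` callback) standing in for the service's real methods:

```python
def reconcile_check_flags(model_data, sqlite_attempted, civitai_api_not_found, save):
    """Persist metadata whenever db_checked changes, not only when the
    CivitAI API reports the model as not found."""
    needs_save = False
    if sqlite_attempted:
        model_data["db_checked"] = True
        needs_save = True
    if civitai_api_not_found:
        model_data["civitai_deleted"] = True
        needs_save = True
    if needs_save:
        save(dict(model_data))  # save a copy of the updated state
    return needs_save
```

For a deleted model, `sqlite_attempted` is True while `civitai_api_not_found` stays False, so the save still fires and the model is skipped on later refreshes.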
22 changed files with 933 additions and 103 deletions

View File

@@ -34,6 +34,10 @@ Enhance your Civitai browsing experience with our companion browser extension! S
## Release Notes
### v0.9.15
* **Filter Presets** - Save filter combinations as presets for quick switching and reapplication.
* **Bug Fixes** - Fixed various bugs for improved stability.
### v0.9.14
* **LoRA Cycler Node** - Introduced a new LoRA Cycler node that enables iteration through specified LoRAs with support for repeat count and pause iteration functionality. Refer to the new "Lora Cycler" template workflow for a concrete example.
* **Enhanced Prompt Node with Tag Autocomplete** - Enhanced the Prompt node with comprehensive tag autocomplete based on merged Danbooru + e621 tags. Implemented a command system with shortcuts like `/char` or `/artist` for category-specific tag searching, plus `/ac` and `/noac` commands to quickly enable or disable autocomplete. Refer to the "Lora Manager Basic" template workflow in ComfyUI -> Templates -> ComfyUI-Lora-Manager for detailed tips.
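The command shortcuts mentioned above might be dispatched by a small prefix parser. This is a hypothetical sketch only — the function name, the recognized command set, and the return shape are assumptions, not the extension's actual implementation:

```python
def parse_tag_command(text):
    """Split input like '/char miku' into a (command, query) pair.
    Plain input (no known leading command) returns (None, text)."""
    known = {"char", "artist", "ac", "noac"}
    if text.startswith("/"):
        head, _, rest = text[1:].partition(" ")
        if head in known:
            return head, rest.strip()
    return None, text
```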

View File

@@ -645,6 +645,23 @@ class Config:
checkpoint_map = self._dedupe_existing_paths(checkpoint_paths)
unet_map = self._dedupe_existing_paths(unet_paths)
# Detect when checkpoints and unet share the same physical location
# This is a configuration issue that can cause duplicate model entries
overlapping_real_paths = set(checkpoint_map.keys()) & set(unet_map.keys())
if overlapping_real_paths:
logger.warning(
"Detected overlapping paths between 'checkpoints' and 'diffusion_models' (unet). "
"They should not point to the same physical folder as they are different model types. "
"Please fix your ComfyUI path configuration to separate these folders. "
"Falling back to 'checkpoints' for backward compatibility. "
"Overlapping real paths: %s",
[checkpoint_map.get(rp, rp) for rp in overlapping_real_paths]
)
# Remove overlapping paths from unet_map to prioritize checkpoints
for rp in overlapping_real_paths:
if rp in unet_map:
del unet_map[rp]
merged_map: Dict[str, str] = {}
for real_path, original in {**checkpoint_map, **unet_map}.items():
if real_path not in merged_map:
@@ -749,7 +766,23 @@ class Config:
return f'/api/lm/previews?path={encoded_path}'
def is_preview_path_allowed(self, preview_path: str) -> bool:
"""Return ``True`` if ``preview_path`` is within an allowed directory."""
"""Return ``True`` if ``preview_path`` is within an allowed directory.
If the path is initially rejected, attempts to discover deep symlinks
that were not scanned during initialization. If a symlink is found,
updates the in-memory path mappings and retries the check.
"""
if self._is_path_in_allowed_roots(preview_path):
return True
if self._try_discover_deep_symlink(preview_path):
return self._is_path_in_allowed_roots(preview_path)
return False
def _is_path_in_allowed_roots(self, preview_path: str) -> bool:
"""Check if preview_path is within allowed preview roots without modification."""
if not preview_path:
return False
@@ -759,29 +792,72 @@ class Config:
except Exception:
return False
# Use os.path.normcase for case-insensitive comparison on Windows.
# On Windows, Path.relative_to() is case-sensitive for drive letters,
# causing paths like 'a:/folder' to not match 'A:/folder'.
candidate_str = os.path.normcase(str(candidate))
for root in self._preview_root_paths:
root_str = os.path.normcase(str(root))
# Check if candidate is equal to or under the root directory
if candidate_str == root_str or candidate_str.startswith(root_str + os.sep):
return True
if self._preview_root_paths:
logger.debug(
"Preview path rejected: %s (candidate=%s, num_roots=%d, first_root=%s)",
preview_path,
candidate_str,
len(self._preview_root_paths),
os.path.normcase(str(next(iter(self._preview_root_paths)))),
)
else:
logger.debug(
"Preview path rejected (no roots configured): %s",
preview_path,
)
logger.debug(
"Path not in allowed roots: %s (candidate=%s, num_roots=%d)",
preview_path,
candidate_str,
len(self._preview_root_paths),
)
return False
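The `root + os.sep` guard above matters: a bare `startswith(root)` would wrongly accept sibling directories that share a name prefix. A self-contained sketch of the same check (function name is illustrative):

```python
import os

def path_in_roots(candidate, roots):
    """Accept a path only if it equals a root or sits strictly under one.
    The os.sep suffix keeps '/data/library-backup' from matching the
    root '/data/library'; normcase handles Windows case differences."""
    cand = os.path.normcase(os.path.normpath(candidate))
    for root in roots:
        r = os.path.normcase(os.path.normpath(root))
        if cand == r or cand.startswith(r + os.sep):
            return True
    return False
```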
def _try_discover_deep_symlink(self, preview_path: str) -> bool:
"""Attempt to discover a deep symlink that contains the preview_path.
Walks up from the preview path to the root directories, checking each
parent directory for symlinks. If a symlink is found, updates the
in-memory path mappings and preview roots.
Only updates in-memory state (self._path_mappings and self._preview_root_paths),
does not modify the persistent cache file.
Returns:
True if a symlink was discovered and mappings updated, False otherwise.
"""
if not preview_path:
return False
try:
candidate = Path(preview_path).expanduser()
except Exception:
return False
current = candidate
while True:
try:
if self._is_link(str(current)):
try:
target = os.path.realpath(str(current))
normalized_target = self._normalize_path(target)
normalized_link = self._normalize_path(str(current))
self._path_mappings[normalized_target] = normalized_link
self._preview_root_paths.update(self._expand_preview_root(normalized_target))
self._preview_root_paths.update(self._expand_preview_root(normalized_link))
logger.debug(
"Discovered deep symlink: %s -> %s (preview path: %s)",
normalized_link,
normalized_target,
preview_path
)
return True
except OSError:
pass
except OSError:
pass
parent = current.parent
if parent == current:
break
current = parent
return False
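The walk-up loop in `_try_discover_deep_symlink` can be isolated as a small helper. This standalone sketch (the function name is invented for illustration) shows the termination condition — `parent == current` at the filesystem root — and the symlink probe at each level:

```python
import os
from pathlib import Path

def find_symlink_ancestor(path):
    """Walk from `path` toward the filesystem root and return the first
    ancestor (or the path itself) that is a symlink, else None."""
    current = Path(path)
    while True:
        if os.path.islink(current):
            return current
        parent = current.parent
        if parent == current:  # reached the root; nothing found
            return None
        current = parent
```

`os.path.islink` returns False for paths that do not exist, so probing a not-yet-resolved preview path is safe.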

View File

@@ -60,6 +60,22 @@ class TriggerWordToggleLM:
else:
return data
def _normalize_trigger_words(self, trigger_words):
"""Normalize trigger words by splitting by both single and double commas, stripping whitespace, and filtering empty strings"""
if not trigger_words or not isinstance(trigger_words, str):
return set()
# Split by double commas first to preserve groups, then by single commas
groups = re.split(r",{2,}", trigger_words)
words = []
for group in groups:
# Split each group by single comma
group_words = [word.strip() for word in group.split(",")]
words.extend(group_words)
# Filter out empty strings and return as set
return set(word for word in words if word)
def process_trigger_words(
self,
id,
@@ -81,7 +97,7 @@ class TriggerWordToggleLM:
if (
trigger_words_override
and isinstance(trigger_words_override, str)
and trigger_words_override != trigger_words
and self._normalize_trigger_words(trigger_words_override) != self._normalize_trigger_words(trigger_words)
):
filtered_triggers = trigger_words_override
return (filtered_triggers,)
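To make the normalization behavior concrete, here is a self-contained copy of the splitting logic above (renamed without the leading underscore so it stands alone). Double commas delimit groups, single commas delimit words within a group, and the result is a whitespace-stripped set:

```python
import re

def normalize_trigger_words(trigger_words):
    """Split on runs of two-or-more commas, then on single commas,
    strip whitespace, drop empties, and return the words as a set."""
    if not trigger_words or not isinstance(trigger_words, str):
        return set()
    words = []
    for group in re.split(r",{2,}", trigger_words):
        words.extend(word.strip() for word in group.split(","))
    return {word for word in words if word}
```

Because comparison is set-based, `"a, b,, c"` and `"c, a,b"` normalize to the same value, so reordering or re-spacing an override no longer counts as a change.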

View File

@@ -33,6 +33,10 @@ class PreviewHandler:
raise web.HTTPBadRequest(text="Invalid preview path encoding") from exc
normalized = decoded_path.replace("\\", "/")
if not self._config.is_preview_path_allowed(normalized):
raise web.HTTPForbidden(text="Preview path is not within an allowed directory")
candidate = Path(normalized)
try:
resolved = candidate.expanduser().resolve(strict=False)
@@ -40,12 +44,8 @@ class PreviewHandler:
logger.debug("Failed to resolve preview path %s: %s", normalized, exc)
raise web.HTTPBadRequest(text="Unable to resolve preview path") from exc
resolved_str = str(resolved)
if not self._config.is_preview_path_allowed(resolved_str):
raise web.HTTPForbidden(text="Preview path is not within an allowed directory")
if not resolved.is_file():
logger.debug("Preview file not found at %s", resolved_str)
logger.debug("Preview file not found at %s", str(resolved))
raise web.HTTPNotFound(text="Preview file not found")
# aiohttp's FileResponse handles range requests and content headers for us.

View File

@@ -412,10 +412,11 @@ class RecipeQueryHandler:
if recipe_scanner is None:
raise RuntimeError("Recipe scanner unavailable")
duplicate_groups = await recipe_scanner.find_all_duplicate_recipes()
fingerprint_groups = await recipe_scanner.find_all_duplicate_recipes()
url_groups = await recipe_scanner.find_duplicate_recipes_by_source()
response_data = []
for fingerprint, recipe_ids in duplicate_groups.items():
for fingerprint, recipe_ids in fingerprint_groups.items():
if len(recipe_ids) <= 1:
continue
@@ -439,12 +440,44 @@ class RecipeQueryHandler:
recipes.sort(key=lambda entry: entry.get("modified", 0), reverse=True)
response_data.append(
{
"type": "fingerprint",
"fingerprint": fingerprint,
"count": len(recipes),
"recipes": recipes,
}
)
for url, recipe_ids in url_groups.items():
if len(recipe_ids) <= 1:
continue
recipes = []
for recipe_id in recipe_ids:
recipe = await recipe_scanner.get_recipe_by_id(recipe_id)
if recipe:
recipes.append(
{
"id": recipe.get("id"),
"title": recipe.get("title"),
"file_url": recipe.get("file_url")
or self._format_recipe_file_url(recipe.get("file_path", "")),
"modified": recipe.get("modified"),
"created_date": recipe.get("created_date"),
"lora_count": len(recipe.get("loras", [])),
}
)
if len(recipes) >= 2:
recipes.sort(key=lambda entry: entry.get("modified", 0), reverse=True)
response_data.append(
{
"type": "source_url",
"fingerprint": url,
"count": len(recipes),
"recipes": recipes,
}
)
response_data.sort(key=lambda entry: entry["count"], reverse=True)
return web.json_response({"success": True, "duplicate_groups": response_data})
except Exception as exc:
@@ -1021,7 +1054,7 @@ class RecipeManagementHandler:
"exclude": False,
}
async def _download_remote_media(self, image_url: str) -> tuple[bytes, str]:
async def _download_remote_media(self, image_url: str) -> tuple[bytes, str, Any]:
civitai_client = self._civitai_client_getter()
downloader = await self._downloader_factory()
temp_path = None
@@ -1029,6 +1062,7 @@ class RecipeManagementHandler:
with tempfile.NamedTemporaryFile(delete=False) as temp_file:
temp_path = temp_file.name
download_url = image_url
image_info = None
civitai_match = re.match(r"https://civitai\.com/images/(\d+)", image_url)
if civitai_match:
if civitai_client is None:

View File

@@ -44,6 +44,8 @@ async def initialize_metadata_providers():
logger.debug(f"SQLite metadata provider registered with database: {db_path}")
else:
logger.warning("Metadata archive database is enabled but database file not found")
logger.info("Automatically disabling enable_metadata_archive_db setting")
settings_manager.set('enable_metadata_archive_db', False)
except Exception as e:
logger.error(f"Failed to initialize SQLite metadata provider: {e}")
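The auto-disable behavior above — flip the setting off when the database file is missing, rather than warning on every startup — can be sketched as follows. Names (`init_archive_db`, the `register` callback, the dict-like `settings` store) are illustrative, not the project's actual API:

```python
import os

def init_archive_db(settings, db_path, register, log=print):
    """If the archive DB setting is on but the file is absent,
    disable the setting instead of failing repeatedly."""
    if not settings.get("enable_metadata_archive_db"):
        return False
    if os.path.isfile(db_path):
        register(db_path)
        return True
    log("archive DB enabled but file missing; disabling setting")
    settings["enable_metadata_archive_db"] = False
    return False
```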

View File

@@ -243,17 +243,27 @@ class MetadataSyncService:
last_error = error or last_error
if civitai_metadata is None or metadata_provider is None:
# Track if we need to save metadata
needs_save = False
if sqlite_attempted:
model_data["db_checked"] = True
needs_save = True
if civitai_api_not_found:
model_data["from_civitai"] = False
model_data["civitai_deleted"] = True
model_data["db_checked"] = sqlite_attempted or (enable_archive and model_data.get("db_checked", False))
model_data["last_checked_at"] = datetime.now().timestamp()
needs_save = True
# Save metadata if any state was updated
if needs_save:
data_to_save = model_data.copy()
data_to_save.pop("folder", None)
# Update last_checked_at for sqlite-only attempts if not already set
if "last_checked_at" not in data_to_save:
data_to_save["last_checked_at"] = datetime.now().timestamp()
await self._metadata_manager.save_metadata(file_path, data_to_save)
default_error = (

View File

@@ -676,10 +676,12 @@ class ModelMetadataProviderManager:
def _get_provider(self, provider_name: str = None) -> ModelMetadataProvider:
"""Get provider by name or default provider"""
if provider_name and provider_name in self.providers:
if provider_name:
if provider_name not in self.providers:
raise ValueError(f"Provider '{provider_name}' is not registered")
return self.providers[provider_name]
if self.default_provider is None:
raise ValueError("No default provider set and no valid provider specified")
return self.providers[self.default_provider]
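The change above tightens the lookup semantics: an explicitly named but unregistered provider now raises instead of silently falling back to the default. A minimal standalone sketch of those rules (class and method names invented for the example):

```python
class ProviderRegistry:
    """Explicit-name lookups fail loudly; only an omitted name
    falls back to the default provider."""
    def __init__(self):
        self.providers = {}
        self.default_provider = None

    def get(self, name=None):
        if name:
            if name not in self.providers:
                raise ValueError(f"Provider '{name}' is not registered")
            return self.providers[name]
        if self.default_provider is None:
            raise ValueError("No default provider set and no valid provider specified")
        return self.providers[self.default_provider]
```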

View File

@@ -2231,3 +2231,26 @@ class RecipeScanner:
duplicate_groups = {k: v for k, v in fingerprint_groups.items() if len(v) > 1}
return duplicate_groups
async def find_duplicate_recipes_by_source(self) -> dict:
"""Find all recipe duplicates based on source_path (Civitai image URLs)
Returns:
Dictionary where keys are source URLs and values are lists of recipe IDs
"""
cache = await self.get_cached_data()
url_groups = {}
for recipe in cache.raw_data:
source_url = recipe.get('source_path', '').strip()
if not source_url:
continue
if source_url not in url_groups:
url_groups[source_url] = []
url_groups[source_url].append(recipe.get('id'))
duplicate_groups = {k: v for k, v in url_groups.items() if len(v) > 1}
return duplicate_groups
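The grouping above reduces to a bucket-then-filter pass. A self-contained version (renamed, and taking a plain list so it runs without the scanner or its cache):

```python
def group_duplicates_by_source(recipes):
    """Bucket recipe IDs by non-empty source_path, then keep only
    buckets holding more than one recipe."""
    url_groups = {}
    for recipe in recipes:
        source_url = (recipe.get("source_path") or "").strip()
        if not source_url:
            continue
        url_groups.setdefault(source_url, []).append(recipe.get("id"))
    return {url: ids for url, ids in url_groups.items() if len(ids) > 1}
```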

View File

@@ -138,19 +138,15 @@ def calculate_recipe_fingerprint(loras):
if not loras:
return ""
# Filter valid entries and extract hash and strength
valid_loras = []
for lora in loras:
# Skip excluded loras
if lora.get("exclude", False):
continue
# Get the hash - use modelVersionId as fallback if hash is empty
hash_value = lora.get("hash", "").lower()
if not hash_value and lora.get("isDeleted", False) and lora.get("modelVersionId"):
if not hash_value and lora.get("modelVersionId"):
hash_value = str(lora.get("modelVersionId"))
# Skip entries without a valid hash
if not hash_value:
continue
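The fallback rule changed in this hunk: `modelVersionId` now substitutes for a missing hash on every lora, not only deleted ones. A sketch of just the key-selection step feeding the fingerprint (the function name is invented; only the keys are shown, not the final hashing):

```python
def lora_fingerprint_keys(loras):
    """Return the per-lora identity keys used for fingerprinting:
    lowercased hash, or modelVersionId when the hash is empty."""
    keys = []
    for lora in loras:
        if lora.get("exclude", False):  # excluded loras never contribute
            continue
        hash_value = lora.get("hash", "").lower()
        if not hash_value and lora.get("modelVersionId"):
            hash_value = str(lora.get("modelVersionId"))
        if not hash_value:  # no usable identity at all
            continue
        keys.append(hash_value)
    return keys
```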

View File

@@ -1,7 +1,7 @@
[project]
name = "comfyui-lora-manager"
description = "Revolutionize your workflow with the ultimate LoRA companion for ComfyUI!"
version = "0.9.14"
version = "0.9.15"
license = {file = "LICENSE"}
dependencies = [
"aiohttp",

View File

@@ -512,6 +512,10 @@
.filter-preset.active .preset-delete-btn {
color: white;
opacity: 0;
}
.filter-preset:hover.active .preset-delete-btn {
opacity: 0.8;
}
@@ -529,13 +533,16 @@
align-items: center;
gap: 6px;
white-space: nowrap;
max-width: 120px; /* Prevent long names from breaking layout */
overflow: hidden;
text-overflow: ellipsis;
}
.preset-delete-btn {
background: none;
border: none;
color: var(--text-color);
opacity: 0.5;
opacity: 0; /* Hidden by default */
cursor: pointer;
padding: 4px;
display: flex;
@@ -546,6 +553,10 @@
margin-left: auto;
}
.filter-preset:hover .preset-delete-btn {
opacity: 0.5; /* Show on hover */
}
.preset-delete-btn:hover {
opacity: 1;
color: var(--lora-error, #e74c3c);

View File

@@ -751,12 +751,7 @@ export class FilterPresetManager {
const presetName = document.createElement('span');
presetName.className = 'preset-name';
if (isActive) {
presetName.innerHTML = `<i class="fas fa-check"></i> ${preset.name}`;
} else {
presetName.textContent = preset.name;
}
presetName.textContent = preset.name;
presetName.title = translate('header.filter.presetClickTooltip', { name: preset.name }, `Click to apply preset "${preset.name}"`);
const deleteBtn = document.createElement('button');

View File

@@ -0,0 +1,160 @@
"""Tests for checkpoint path overlap detection."""
import logging
import os
import pytest
from py import config as config_module
def _normalize(path: str) -> str:
return os.path.normpath(path).replace(os.sep, "/")
class TestCheckpointPathOverlap:
"""Test detection of overlapping paths between checkpoints and unet."""
def test_overlapping_paths_prioritizes_checkpoints(
self, monkeypatch: pytest.MonkeyPatch, tmp_path, caplog
):
"""Test that overlapping paths prioritize checkpoints for backward compatibility."""
# Create a shared physical folder
shared_dir = tmp_path / "shared_models"
shared_dir.mkdir()
# Create two symlinks pointing to the same physical folder
checkpoints_link = tmp_path / "checkpoints"
unet_link = tmp_path / "unet"
checkpoints_link.symlink_to(shared_dir, target_is_directory=True)
unet_link.symlink_to(shared_dir, target_is_directory=True)
# Create Config instance with overlapping paths
with caplog.at_level(logging.WARNING, logger=config_module.logger.name):
config = config_module.Config.__new__(config_module.Config)
config._path_mappings = {}
config._preview_root_paths = set()
config._cached_fingerprint = None
# Call the method under test
result = config._prepare_checkpoint_paths(
[str(checkpoints_link)], [str(unet_link)]
)
# Verify warning was logged
warning_messages = [
record.message
for record in caplog.records
if record.levelname == "WARNING"
and "overlapping paths" in record.message.lower()
]
assert len(warning_messages) == 1
assert "checkpoints" in warning_messages[0].lower()
assert "diffusion_models" in warning_messages[0].lower() or "unet" in warning_messages[0].lower()
# Verify warning mentions backward compatibility fallback
assert "falling back" in warning_messages[0].lower() or "backward compatibility" in warning_messages[0].lower()
# Verify only one path is returned (deduplication still works)
assert len(result) == 1
# Prioritizes checkpoints path for backward compatibility
assert _normalize(result[0]) == _normalize(str(checkpoints_link))
# Verify checkpoints_roots has the path (prioritized)
assert len(config.checkpoints_roots) == 1
assert _normalize(config.checkpoints_roots[0]) == _normalize(str(checkpoints_link))
# Verify unet_roots is empty (overlapping paths removed)
assert config.unet_roots == []
def test_non_overlapping_paths_no_warning(
self, monkeypatch: pytest.MonkeyPatch, tmp_path, caplog
):
"""Test that non-overlapping paths do not trigger a warning."""
# Create separate physical folders
checkpoints_dir = tmp_path / "checkpoints"
checkpoints_dir.mkdir()
unet_dir = tmp_path / "unet"
unet_dir.mkdir()
# Create Config instance with separate paths
with caplog.at_level(logging.WARNING, logger=config_module.logger.name):
config = config_module.Config.__new__(config_module.Config)
config._path_mappings = {}
config._preview_root_paths = set()
config._cached_fingerprint = None
result = config._prepare_checkpoint_paths(
[str(checkpoints_dir)], [str(unet_dir)]
)
# Verify no overlapping paths warning was logged
warning_messages = [
record.message
for record in caplog.records
if record.levelname == "WARNING"
and "overlapping paths" in record.message.lower()
]
assert len(warning_messages) == 0
# Verify both paths are returned
assert len(result) == 2
normalized_result = [_normalize(p) for p in result]
assert _normalize(str(checkpoints_dir)) in normalized_result
assert _normalize(str(unet_dir)) in normalized_result
# Verify both roots are properly set
assert len(config.checkpoints_roots) == 1
assert len(config.unet_roots) == 1
def test_partial_overlap_prioritizes_checkpoints(
self, monkeypatch: pytest.MonkeyPatch, tmp_path, caplog
):
"""Test partial overlap - overlapping paths prioritize checkpoints."""
# Create folders
shared_dir = tmp_path / "shared"
shared_dir.mkdir()
separate_checkpoint = tmp_path / "separate_ckpt"
separate_checkpoint.mkdir()
separate_unet = tmp_path / "separate_unet"
separate_unet.mkdir()
# Create symlinks - one shared, others separate
shared_link = tmp_path / "shared_link"
shared_link.symlink_to(shared_dir, target_is_directory=True)
with caplog.at_level(logging.WARNING, logger=config_module.logger.name):
config = config_module.Config.__new__(config_module.Config)
config._path_mappings = {}
config._preview_root_paths = set()
config._cached_fingerprint = None
# One checkpoint path overlaps with one unet path
result = config._prepare_checkpoint_paths(
[str(shared_link), str(separate_checkpoint)],
[str(shared_link), str(separate_unet)]
)
# Verify warning was logged for the overlapping path
warning_messages = [
record.message
for record in caplog.records
if record.levelname == "WARNING"
and "overlapping paths" in record.message.lower()
]
assert len(warning_messages) == 1
# Verify 3 unique paths (shared counted once as checkpoint, plus separate ones)
assert len(result) == 3
# Verify the overlapping path appears in warning message
assert str(shared_link.name) in warning_messages[0] or str(shared_dir.name) in warning_messages[0]
# Verify checkpoints_roots includes both checkpoint paths (including the shared one)
assert len(config.checkpoints_roots) == 2
checkpoint_normalized = [_normalize(p) for p in config.checkpoints_roots]
assert _normalize(str(shared_link)) in checkpoint_normalized
assert _normalize(str(separate_checkpoint)) in checkpoint_normalized
# Verify unet_roots only includes the non-overlapping unet path
assert len(config.unet_roots) == 1
assert _normalize(config.unet_roots[0]) == _normalize(str(separate_unet))

View File

@@ -298,6 +298,134 @@ def test_deep_symlink_not_scanned(monkeypatch: pytest.MonkeyPatch, tmp_path):
assert normalized_external not in cfg._path_mappings
def test_deep_symlink_discovered_on_preview_access(monkeypatch: pytest.MonkeyPatch, tmp_path):
"""Deep symlinks are discovered dynamically when preview is accessed."""
loras_dir, settings_dir = _setup_paths(monkeypatch, tmp_path)
# Create nested structure with deep symlink at second level
subdir = loras_dir / "anime"
subdir.mkdir()
external_dir = tmp_path / "external"
external_dir.mkdir()
deep_symlink = subdir / "styles"
deep_symlink.symlink_to(external_dir, target_is_directory=True)
# Create preview file under deep symlink
preview_file = deep_symlink / "model.preview.jpeg"
preview_file.write_bytes(b"preview")
# Config should not initially detect deep symlinks
cfg = config_module.Config()
normalized_external = _normalize(str(external_dir))
normalized_deep_link = _normalize(str(deep_symlink))
assert normalized_external not in cfg._path_mappings
# First preview access triggers symlink discovery automatically and returns True
is_allowed = cfg.is_preview_path_allowed(str(preview_file))
# After discovery, preview should be allowed
assert is_allowed
assert normalized_external in cfg._path_mappings
assert cfg._path_mappings[normalized_external] == normalized_deep_link
# Verify preview path is now allowed without triggering discovery again
assert cfg.is_preview_path_allowed(str(preview_file))
def test_deep_symlink_at_third_level(monkeypatch: pytest.MonkeyPatch, tmp_path):
"""Deep symlinks at third level are also discovered dynamically."""
loras_dir, settings_dir = _setup_paths(monkeypatch, tmp_path)
# Create nested structure with deep symlink at third level
level1 = loras_dir / "category"
level1.mkdir()
level2 = level1 / "subcategory"
level2.mkdir()
external_dir = tmp_path / "external_deep"
external_dir.mkdir()
deep_symlink = level2 / "deep"
deep_symlink.symlink_to(external_dir, target_is_directory=True)
# Create preview file under deep symlink
preview_file = deep_symlink / "preview.webp"
preview_file.write_bytes(b"test")
cfg = config_module.Config()
# First preview access triggers symlink discovery at third level
is_allowed = cfg.is_preview_path_allowed(str(preview_file))
assert is_allowed
normalized_external = _normalize(str(external_dir))
normalized_deep_link = _normalize(str(deep_symlink))
assert normalized_external in cfg._path_mappings
assert cfg._path_mappings[normalized_external] == normalized_deep_link
def test_deep_symlink_points_outside_roots(monkeypatch: pytest.MonkeyPatch, tmp_path):
"""Deep symlinks can point to locations outside configured roots."""
loras_dir, settings_dir = _setup_paths(monkeypatch, tmp_path)
# Create nested structure with deep symlink pointing outside roots
subdir = loras_dir / "shared"
subdir.mkdir()
outside_root = tmp_path / "storage"
outside_root.mkdir()
deep_symlink = subdir / "models"
deep_symlink.symlink_to(outside_root, target_is_directory=True)
# Create preview file under deep symlink (outside original roots)
preview_file = deep_symlink / "external.png"
preview_file.write_bytes(b"external")
cfg = config_module.Config()
# Preview access triggers symlink discovery
is_allowed = cfg.is_preview_path_allowed(str(preview_file))
# After discovery, preview should be allowed even though target is outside roots
assert is_allowed
normalized_outside = _normalize(str(outside_root))
assert normalized_outside in cfg._path_mappings
def test_normal_path_unaffected_by_discovery(monkeypatch: pytest.MonkeyPatch, tmp_path):
"""Normal paths (no symlinks) are not affected by symlink discovery logic."""
loras_dir, settings_dir = _setup_paths(monkeypatch, tmp_path)
# Create normal file structure (no symlinks)
preview_file = loras_dir / "normal.preview.jpeg"
preview_file.write_bytes(b"normal")
cfg = config_module.Config()
# Normal paths work without any discovery
assert cfg.is_preview_path_allowed(str(preview_file))
assert len(cfg._path_mappings) == 0
def test_first_level_symlink_still_works(monkeypatch: pytest.MonkeyPatch, tmp_path):
"""First-level symlinks continue to work as before."""
loras_dir, settings_dir = _setup_paths(monkeypatch, tmp_path)
# Create first-level symlink
external_dir = tmp_path / "first_level_external"
external_dir.mkdir()
first_symlink = loras_dir / "first_level"
first_symlink.symlink_to(external_dir, target_is_directory=True)
# Create preview file under first-level symlink
preview_file = first_symlink / "model.png"
preview_file.write_bytes(b"first_level")
cfg = config_module.Config()
# First-level symlinks are scanned during initialization
normalized_external = _normalize(str(external_dir))
assert normalized_external in cfg._path_mappings
assert cfg.is_preview_path_allowed(str(preview_file))
def test_legacy_symlink_cache_automatic_cleanup(monkeypatch: pytest.MonkeyPatch, tmp_path):
"""Test that legacy symlink cache is automatically cleaned up after migration."""
settings_dir = tmp_path / "settings"

View File

@@ -188,3 +188,91 @@ def test_is_preview_path_allowed_rejects_prefix_without_separator(tmp_path):
# The sibling path should NOT be allowed even though it shares a prefix
assert not config.is_preview_path_allowed(str(sibling_file)), \
f"Path in '{sibling_root}' should NOT be allowed when root is '{library_root}'"
async def test_preview_handler_serves_from_deep_symlink(tmp_path):
"""Test that previews under deep symlinks are served correctly."""
library_root = tmp_path / "library"
library_root.mkdir()
# Create nested structure with deep symlink at second level
subdir = library_root / "anime"
subdir.mkdir()
external_dir = tmp_path / "external"
external_dir.mkdir()
deep_symlink = subdir / "styles"
deep_symlink.symlink_to(external_dir, target_is_directory=True)
# Create preview file under deep symlink
preview_file = deep_symlink / "model.preview.webp"
preview_file.write_bytes(b"preview_content")
config = Config()
config.apply_library_settings(
{
"folder_paths": {
"loras": [str(library_root)],
"checkpoints": [],
"unet": [],
"embeddings": [],
}
}
)
handler = PreviewHandler(config=config)
encoded_path = urllib.parse.quote(str(preview_file), safe="")
request = make_mocked_request("GET", f"/api/lm/previews?path={encoded_path}")
response = await handler.serve_preview(request)
assert isinstance(response, web.FileResponse)
assert response.status == 200
assert Path(response._path) == preview_file.resolve()
async def test_deep_symlink_discovered_on_first_access(tmp_path):
"""Test that deep symlinks are discovered on first preview access."""
library_root = tmp_path / "library"
library_root.mkdir()
# Create nested structure with deep symlink at second level
subdir = library_root / "category"
subdir.mkdir()
external_dir = tmp_path / "storage"
external_dir.mkdir()
deep_symlink = subdir / "models"
deep_symlink.symlink_to(external_dir, target_is_directory=True)
# Create preview file under deep symlink
preview_file = deep_symlink / "test.png"
preview_file.write_bytes(b"test_image")
config = Config()
config.apply_library_settings(
{
"folder_paths": {
"loras": [str(library_root)],
"checkpoints": [],
"unet": [],
"embeddings": [],
}
}
)
# Deep symlink should not be in mappings initially
normalized_external = os.path.normpath(str(external_dir)).replace(os.sep, '/')
assert normalized_external not in config._path_mappings
handler = PreviewHandler(config=config)
encoded_path = urllib.parse.quote(str(preview_file), safe="")
request = make_mocked_request("GET", f"/api/lm/previews?path={encoded_path}")
# First access should trigger symlink discovery and serve the preview
response = await handler.serve_preview(request)
assert isinstance(response, web.FileResponse)
assert response.status == 200
assert Path(response._path) == preview_file.resolve()
# Deep symlink should now be in mappings
assert normalized_external in config._path_mappings

View File

@@ -0,0 +1,110 @@
"""Test for duplicate detection by source URL."""
import pytest
from unittest.mock import AsyncMock, MagicMock
@pytest.mark.asyncio
async def test_find_duplicate_recipes_by_source():
"""Test that duplicate recipes are detected by source URL."""
from py.services.recipe_scanner import RecipeScanner
scanner = MagicMock(spec=RecipeScanner)
scanner.get_cached_data = AsyncMock()
cache = MagicMock()
cache.raw_data = [
{
'id': '8705c972-ef08-47f3-8ac3-9ac3b8ff4c0b',
'source_path': 'https://civitai.com/images/119165946',
'title': 'Recipe 1'
},
{
'id': '52e636ce-ea9f-4f64-a6a9-c704bd715889',
'source_path': 'https://civitai.com/images/119165946',
'title': 'Recipe 2'
},
{
'id': '00000000-0000-0000-0000-000000000001',
'source_path': 'https://civitai.com/images/999999999',
'title': 'Recipe 3'
},
{
'id': '00000000-0000-0000-0000-000000000002',
'source_path': '',
'title': 'Recipe 4 (no source)'
},
]
scanner.get_cached_data.return_value = cache
# Call the actual method on the mocked scanner
from py.services.recipe_scanner import RecipeScanner as RealRecipeScanner
result = await RealRecipeScanner.find_duplicate_recipes_by_source(scanner)
assert len(result) == 1
assert 'https://civitai.com/images/119165946' in result
assert len(result['https://civitai.com/images/119165946']) == 2
assert '8705c972-ef08-47f3-8ac3-9ac3b8ff4c0b' in result['https://civitai.com/images/119165946']
assert '52e636ce-ea9f-4f64-a6a9-c704bd715889' in result['https://civitai.com/images/119165946']
@pytest.mark.asyncio
async def test_find_duplicate_recipes_by_source_empty():
"""Test that empty result is returned when no duplicates found."""
from py.services.recipe_scanner import RecipeScanner
scanner = MagicMock(spec=RecipeScanner)
scanner.get_cached_data = AsyncMock()
cache = MagicMock()
cache.raw_data = [
{
'id': '8705c972-ef08-47f3-8ac3-9ac3b8ff4c0b',
'source_path': 'https://civitai.com/images/119165946',
'title': 'Recipe 1'
},
{
'id': '00000000-0000-0000-0000-000000000002',
'source_path': '',
'title': 'Recipe 2 (no source)'
},
]
scanner.get_cached_data.return_value = cache
from py.services.recipe_scanner import RecipeScanner as RealRecipeScanner
result = await RealRecipeScanner.find_duplicate_recipes_by_source(scanner)
assert len(result) == 0
@pytest.mark.asyncio
async def test_find_duplicate_recipes_by_source_trimming_whitespace():
"""Test that whitespace is trimmed from source URLs."""
from py.services.recipe_scanner import RecipeScanner
scanner = MagicMock(spec=RecipeScanner)
scanner.get_cached_data = AsyncMock()
cache = MagicMock()
cache.raw_data = [
{
'id': '8705c972-ef08-47f3-8ac3-9ac3b8ff4c0b',
'source_path': 'https://civitai.com/images/119165946',
'title': 'Recipe 1'
},
{
'id': '52e636ce-ea9f-4f64-a6a9-c704bd715889',
'source_path': ' https://civitai.com/images/119165946 ',
'title': 'Recipe 2'
},
]
scanner.get_cached_data.return_value = cache
from py.services.recipe_scanner import RecipeScanner as RealRecipeScanner
result = await RealRecipeScanner.find_duplicate_recipes_by_source(scanner)
assert len(result) == 1
assert 'https://civitai.com/images/119165946' in result
assert len(result['https://civitai.com/images/119165946']) == 2
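The behavior these three tests pin down amounts to a small grouping pass over the cached recipes: trim each `source_path`, skip recipes without one, and keep only source URLs that collect two or more ids. A minimal sketch written as a standalone function over `raw_data` (not the actual `RecipeScanner.find_duplicate_recipes_by_source` method):

```python
def group_duplicate_recipes_by_source(raw_data):
    """Group recipe ids by trimmed source URL; keep only groups of 2+."""
    by_source = {}
    for recipe in raw_data:
        source = (recipe.get("source_path") or "").strip()
        if not source:
            continue  # recipes without a source URL can never be URL-duplicates
        by_source.setdefault(source, []).append(recipe["id"])
    return {url: ids for url, ids in by_source.items() if len(ids) > 1}
```

Trimming before grouping is what makes `' https://... '` and `'https://...'` land in the same bucket, matching the whitespace test above.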


@@ -482,6 +482,81 @@ async def test_relink_metadata_raises_when_version_missing():
        model_version_id=None,
    )


@pytest.mark.asyncio
async def test_fetch_and_update_model_persists_db_checked_when_sqlite_fails(tmp_path):
    """
    Regression test: When a deleted model is checked against sqlite and not found,
    db_checked=True must be persisted to disk so the model is skipped in future refreshes.

    Previously, db_checked was set in memory but never saved because the save_metadata
    call was inside the `if civitai_api_not_found:` block, which is False for deleted
    models (since the default CivitAI API is never tried).
    """
    default_provider = SimpleNamespace(
        get_model_by_hash=AsyncMock(),
        get_model_version=AsyncMock(),
    )
    civarchive_provider = SimpleNamespace(
        get_model_by_hash=AsyncMock(return_value=(None, "Model not found")),
        get_model_version=AsyncMock(),
    )
    sqlite_provider = SimpleNamespace(
        get_model_by_hash=AsyncMock(return_value=(None, "Model not found")),
        get_model_version=AsyncMock(),
    )

    async def select_provider(name: str):
        if name == "civarchive_api":
            return civarchive_provider
        if name == "sqlite":
            return sqlite_provider
        return default_provider

    provider_selector = AsyncMock(side_effect=select_provider)
    helpers = build_service(
        settings_values={"enable_metadata_archive_db": True},
        default_provider=default_provider,
        provider_selector=provider_selector,
    )

    model_path = tmp_path / "model.safetensors"
    model_data = {
        "civitai_deleted": True,
        "db_checked": False,
        "from_civitai": False,
        "file_path": str(model_path),
        "model_name": "Deleted Model",
    }
    update_cache = AsyncMock()

    ok, error = await helpers.service.fetch_and_update_model(
        sha256="deadbeef",
        file_path=str(model_path),
        model_data=model_data,
        update_cache_func=update_cache,
    )

    # The call should fail because neither provider found metadata
    assert not ok
    assert error is not None
    assert "Model not found" in error or "not found in metadata archive DB" in error

    # Both providers should have been tried
    assert civarchive_provider.get_model_by_hash.await_count == 1
    assert sqlite_provider.get_model_by_hash.await_count == 1

    # db_checked should be True in memory
    assert model_data["db_checked"] is True

    # CRITICAL: metadata should have been saved to disk with db_checked=True
    helpers.metadata_manager.save_metadata.assert_awaited_once()
    saved_call = helpers.metadata_manager.save_metadata.await_args
    saved_data = saved_call.args[1]
    assert saved_data["db_checked"] is True
    assert "folder" not in saved_data  # folder should be stripped
    assert "last_checked_at" in saved_data  # timestamp should be set


@pytest.mark.asyncio
async def test_fetch_and_update_model_does_not_overwrite_api_metadata_with_archive(tmp_path):
    helpers = build_service()

@@ -0,0 +1,100 @@
"""Test for modelVersionId fallback in fingerprint calculation."""
import pytest
from py.utils.utils import calculate_recipe_fingerprint
def test_calculate_fingerprint_with_model_version_id_fallback():
"""Test that fingerprint uses modelVersionId when hash is empty, even when not deleted."""
loras = [
{
"hash": "",
"strength": 1.0,
"modelVersionId": 2639467,
"isDeleted": False,
"exclude": False
}
]
fingerprint = calculate_recipe_fingerprint(loras)
assert fingerprint == "2639467:1.0"
def test_calculate_fingerprint_with_multiple_model_version_ids():
"""Test fingerprint with multiple loras using modelVersionId fallback."""
loras = [
{
"hash": "",
"strength": 1.0,
"modelVersionId": 2639467,
"isDeleted": False,
"exclude": False
},
{
"hash": "",
"strength": 0.8,
"modelVersionId": 1234567,
"isDeleted": False,
"exclude": False
}
]
fingerprint = calculate_recipe_fingerprint(loras)
assert fingerprint == "1234567:0.8|2639467:1.0"
def test_calculate_fingerprint_with_deleted_lora():
"""Test that deleted loras with modelVersionId are still included."""
loras = [
{
"hash": "",
"strength": 1.0,
"modelVersionId": 2639467,
"isDeleted": True,
"exclude": False
}
]
fingerprint = calculate_recipe_fingerprint(loras)
assert fingerprint == "2639467:1.0"
def test_calculate_fingerprint_with_excluded_lora():
"""Test that excluded loras are skipped even with modelVersionId."""
loras = [
{
"hash": "",
"strength": 1.0,
"modelVersionId": 2639467,
"isDeleted": False,
"exclude": True
}
]
fingerprint = calculate_recipe_fingerprint(loras)
assert fingerprint == ""
def test_calculate_fingerprint_prefers_hash_over_version_id():
"""Test that hash is used even when modelVersionId is present."""
loras = [
{
"hash": "abc123",
"strength": 1.0,
"modelVersionId": 2639467,
"isDeleted": False,
"exclude": False
}
]
fingerprint = calculate_recipe_fingerprint(loras)
assert fingerprint == "abc123:1.0"
def test_calculate_fingerprint_without_hash_or_version_id():
"""Test that loras without hash or modelVersionId are skipped."""
loras = [
{
"hash": "",
"strength": 1.0,
"modelVersionId": 0,
"isDeleted": False,
"exclude": False
}
]
fingerprint = calculate_recipe_fingerprint(loras)
assert fingerprint == ""


@@ -502,7 +502,7 @@ const onRepeatBlur = (event: Event) => {
/* Repeat Controls */
.repeat-input {
width: 40px;
width: 50px;
height: 32px;
padding: 0 6px;
background: rgba(26, 32, 44, 0.9);


@@ -1464,16 +1464,16 @@ to { transform: rotate(360deg);
box-sizing: border-box;
}
.cycler-settings[data-v-c4d1cba7] {
.cycler-settings[data-v-5b16b9d3] {
display: flex;
flex-direction: column;
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, sans-serif;
color: #e4e4e7;
}
.settings-header[data-v-c4d1cba7] {
.settings-header[data-v-5b16b9d3] {
margin-bottom: 8px;
}
.settings-title[data-v-c4d1cba7] {
.settings-title[data-v-5b16b9d3] {
font-size: 10px;
font-weight: 600;
letter-spacing: 0.05em;
@@ -1482,10 +1482,10 @@ to { transform: rotate(360deg);
margin: 0;
text-transform: uppercase;
}
.setting-section[data-v-c4d1cba7] {
.setting-section[data-v-5b16b9d3] {
margin-bottom: 8px;
}
.setting-label[data-v-c4d1cba7] {
.setting-label[data-v-5b16b9d3] {
font-size: 13px;
font-weight: 500;
color: rgba(226, 232, 240, 0.8);
@@ -1494,10 +1494,10 @@ to { transform: rotate(360deg);
}
/* Progress Display */
.progress-section[data-v-c4d1cba7] {
.progress-section[data-v-5b16b9d3] {
margin-bottom: 12px;
}
.progress-display[data-v-c4d1cba7] {
.progress-display[data-v-5b16b9d3] {
background: rgba(26, 32, 44, 0.9);
border: 1px solid rgba(226, 232, 240, 0.2);
border-radius: 6px;
@@ -1507,31 +1507,31 @@ to { transform: rotate(360deg);
align-items: center;
transition: border-color 0.3s ease;
}
.progress-display.executing[data-v-c4d1cba7] {
.progress-display.executing[data-v-5b16b9d3] {
border-color: rgba(66, 153, 225, 0.5);
animation: pulse-c4d1cba7 2s ease-in-out infinite;
animation: pulse-5b16b9d3 2s ease-in-out infinite;
}
@keyframes pulse-c4d1cba7 {
@keyframes pulse-5b16b9d3 {
0%, 100% { border-color: rgba(66, 153, 225, 0.3);
}
50% { border-color: rgba(66, 153, 225, 0.7);
}
}
.progress-info[data-v-c4d1cba7] {
.progress-info[data-v-5b16b9d3] {
display: flex;
flex-direction: column;
gap: 2px;
min-width: 0;
flex: 1;
}
.progress-label[data-v-c4d1cba7] {
.progress-label[data-v-5b16b9d3] {
font-size: 10px;
font-weight: 500;
color: rgba(226, 232, 240, 0.5);
text-transform: uppercase;
letter-spacing: 0.03em;
}
.progress-name[data-v-c4d1cba7] {
.progress-name[data-v-5b16b9d3] {
font-size: 13px;
font-weight: 500;
color: rgba(191, 219, 254, 1);
@@ -1539,7 +1539,7 @@ to { transform: rotate(360deg);
text-overflow: ellipsis;
white-space: nowrap;
}
.progress-name.clickable[data-v-c4d1cba7] {
.progress-name.clickable[data-v-5b16b9d3] {
cursor: pointer;
padding: 2px 6px;
margin: -2px -6px;
@@ -1549,34 +1549,34 @@ to { transform: rotate(360deg);
align-items: center;
gap: 4px;
}
.progress-name.clickable[data-v-c4d1cba7]:hover:not(.disabled) {
.progress-name.clickable[data-v-5b16b9d3]:hover:not(.disabled) {
background: rgba(66, 153, 225, 0.2);
color: rgba(191, 219, 254, 1);
}
.progress-name.clickable.disabled[data-v-c4d1cba7] {
.progress-name.clickable.disabled[data-v-5b16b9d3] {
cursor: not-allowed;
opacity: 0.5;
}
.progress-info.disabled[data-v-c4d1cba7] {
.progress-info.disabled[data-v-5b16b9d3] {
cursor: not-allowed;
}
.selector-icon[data-v-c4d1cba7] {
.selector-icon[data-v-5b16b9d3] {
width: 16px;
height: 16px;
opacity: 0.5;
flex-shrink: 0;
}
.progress-name.clickable:hover .selector-icon[data-v-c4d1cba7] {
.progress-name.clickable:hover .selector-icon[data-v-5b16b9d3] {
opacity: 0.8;
}
.progress-counter[data-v-c4d1cba7] {
.progress-counter[data-v-5b16b9d3] {
display: flex;
align-items: center;
gap: 4px;
padding-left: 12px;
flex-shrink: 0;
}
.progress-index[data-v-c4d1cba7] {
.progress-index[data-v-5b16b9d3] {
font-size: 18px;
font-weight: 600;
color: rgba(66, 153, 225, 1);
@@ -1585,12 +1585,12 @@ to { transform: rotate(360deg);
text-align: right;
font-variant-numeric: tabular-nums;
}
.progress-separator[data-v-c4d1cba7] {
.progress-separator[data-v-5b16b9d3] {
font-size: 14px;
color: rgba(226, 232, 240, 0.4);
margin: 0 2px;
}
.progress-total[data-v-c4d1cba7] {
.progress-total[data-v-5b16b9d3] {
font-size: 14px;
font-weight: 500;
color: rgba(226, 232, 240, 0.6);
@@ -1601,7 +1601,7 @@ to { transform: rotate(360deg);
}
/* Repeat Progress */
.repeat-progress[data-v-c4d1cba7] {
.repeat-progress[data-v-5b16b9d3] {
display: flex;
align-items: center;
gap: 6px;
@@ -1611,23 +1611,23 @@ to { transform: rotate(360deg);
border: 1px solid rgba(226, 232, 240, 0.1);
border-radius: 4px;
}
.repeat-progress-track[data-v-c4d1cba7] {
.repeat-progress-track[data-v-5b16b9d3] {
width: 32px;
height: 4px;
background: rgba(226, 232, 240, 0.15);
border-radius: 2px;
overflow: hidden;
}
.repeat-progress-fill[data-v-c4d1cba7] {
.repeat-progress-fill[data-v-5b16b9d3] {
height: 100%;
background: linear-gradient(90deg, #f59e0b, #fbbf24);
border-radius: 2px;
transition: width 0.3s ease;
}
.repeat-progress-fill.is-complete[data-v-c4d1cba7] {
.repeat-progress-fill.is-complete[data-v-5b16b9d3] {
background: linear-gradient(90deg, #10b981, #34d399);
}
.repeat-progress-text[data-v-c4d1cba7] {
.repeat-progress-text[data-v-5b16b9d3] {
font-size: 10px;
font-family: 'SF Mono', 'Roboto Mono', monospace;
color: rgba(253, 230, 138, 0.9);
@@ -1636,19 +1636,19 @@ to { transform: rotate(360deg);
}
/* Index Controls Row - Grouped Layout */
.index-controls-row[data-v-c4d1cba7] {
.index-controls-row[data-v-5b16b9d3] {
display: flex;
align-items: flex-end;
gap: 16px;
}
/* Control Group */
.control-group[data-v-c4d1cba7] {
.control-group[data-v-5b16b9d3] {
display: flex;
flex-direction: column;
gap: 6px;
}
.control-group-label[data-v-c4d1cba7] {
.control-group-label[data-v-5b16b9d3] {
font-size: 11px;
font-weight: 500;
color: rgba(226, 232, 240, 0.5);
@@ -1656,13 +1656,13 @@ to { transform: rotate(360deg);
letter-spacing: 0.03em;
line-height: 1;
}
.control-group-content[data-v-c4d1cba7] {
.control-group-content[data-v-5b16b9d3] {
display: flex;
align-items: baseline;
gap: 4px;
height: 32px;
}
.index-input[data-v-c4d1cba7] {
.index-input[data-v-5b16b9d3] {
width: 50px;
height: 32px;
padding: 0 8px;
@@ -1675,15 +1675,15 @@ to { transform: rotate(360deg);
line-height: 32px;
box-sizing: border-box;
}
.index-input[data-v-c4d1cba7]:focus {
.index-input[data-v-5b16b9d3]:focus {
outline: none;
border-color: rgba(66, 153, 225, 0.6);
}
.index-input[data-v-c4d1cba7]:disabled {
.index-input[data-v-5b16b9d3]:disabled {
opacity: 0.4;
cursor: not-allowed;
}
.index-hint[data-v-c4d1cba7] {
.index-hint[data-v-5b16b9d3] {
font-size: 12px;
color: rgba(226, 232, 240, 0.4);
font-variant-numeric: tabular-nums;
@@ -1691,8 +1691,8 @@ to { transform: rotate(360deg);
}
/* Repeat Controls */
.repeat-input[data-v-c4d1cba7] {
width: 40px;
.repeat-input[data-v-5b16b9d3] {
width: 50px;
height: 32px;
padding: 0 6px;
background: rgba(26, 32, 44, 0.9);
@@ -1705,11 +1705,11 @@ to { transform: rotate(360deg);
line-height: 32px;
box-sizing: border-box;
}
.repeat-input[data-v-c4d1cba7]:focus {
.repeat-input[data-v-5b16b9d3]:focus {
outline: none;
border-color: rgba(66, 153, 225, 0.6);
}
.repeat-suffix[data-v-c4d1cba7] {
.repeat-suffix[data-v-5b16b9d3] {
font-size: 13px;
color: rgba(226, 232, 240, 0.4);
font-weight: 500;
@@ -1717,7 +1717,7 @@ to { transform: rotate(360deg);
}
/* Action Buttons */
.action-buttons[data-v-c4d1cba7] {
.action-buttons[data-v-5b16b9d3] {
display: flex;
align-items: center;
gap: 6px;
@@ -1725,7 +1725,7 @@ to { transform: rotate(360deg);
}
/* Control Buttons */
.control-btn[data-v-c4d1cba7] {
.control-btn[data-v-5b16b9d3] {
display: flex;
align-items: center;
justify-content: center;
@@ -1739,52 +1739,52 @@ to { transform: rotate(360deg);
cursor: pointer;
transition: all 0.2s;
}
.control-btn[data-v-c4d1cba7]:hover:not(:disabled) {
.control-btn[data-v-5b16b9d3]:hover:not(:disabled) {
background: rgba(66, 153, 225, 0.2);
border-color: rgba(66, 153, 225, 0.4);
color: rgba(191, 219, 254, 1);
}
.control-btn[data-v-c4d1cba7]:disabled {
.control-btn[data-v-5b16b9d3]:disabled {
opacity: 0.4;
cursor: not-allowed;
}
.control-btn.active[data-v-c4d1cba7] {
.control-btn.active[data-v-5b16b9d3] {
background: rgba(245, 158, 11, 0.2);
border-color: rgba(245, 158, 11, 0.5);
color: rgba(253, 230, 138, 1);
}
.control-btn.active[data-v-c4d1cba7]:hover {
.control-btn.active[data-v-5b16b9d3]:hover {
background: rgba(245, 158, 11, 0.3);
border-color: rgba(245, 158, 11, 0.6);
}
.control-icon[data-v-c4d1cba7] {
.control-icon[data-v-5b16b9d3] {
width: 14px;
height: 14px;
}
/* Slider Container */
.slider-container[data-v-c4d1cba7] {
.slider-container[data-v-5b16b9d3] {
background: rgba(26, 32, 44, 0.9);
border: 1px solid rgba(226, 232, 240, 0.2);
border-radius: 6px;
padding: 6px;
}
.slider-container--disabled[data-v-c4d1cba7] {
.slider-container--disabled[data-v-5b16b9d3] {
opacity: 0.5;
pointer-events: none;
}
.section-header-with-toggle[data-v-c4d1cba7] {
.section-header-with-toggle[data-v-5b16b9d3] {
display: flex;
align-items: center;
justify-content: space-between;
margin-bottom: 8px;
}
.section-header-with-toggle .setting-label[data-v-c4d1cba7] {
.section-header-with-toggle .setting-label[data-v-5b16b9d3] {
margin-bottom: 4px;
}
/* Toggle Switch */
.toggle-switch[data-v-c4d1cba7] {
.toggle-switch[data-v-5b16b9d3] {
position: relative;
width: 36px;
height: 20px;
@@ -1793,7 +1793,7 @@ to { transform: rotate(360deg);
border: none;
cursor: pointer;
}
.toggle-switch__track[data-v-c4d1cba7] {
.toggle-switch__track[data-v-5b16b9d3] {
position: absolute;
inset: 0;
background: var(--comfy-input-bg, #333);
@@ -1801,11 +1801,11 @@ to { transform: rotate(360deg);
border-radius: 10px;
transition: all 0.2s;
}
.toggle-switch--active .toggle-switch__track[data-v-c4d1cba7] {
.toggle-switch--active .toggle-switch__track[data-v-5b16b9d3] {
background: rgba(66, 153, 225, 0.3);
border-color: rgba(66, 153, 225, 0.6);
}
.toggle-switch__thumb[data-v-c4d1cba7] {
.toggle-switch__thumb[data-v-5b16b9d3] {
position: absolute;
top: 3px;
left: 2px;
@@ -1816,12 +1816,12 @@ to { transform: rotate(360deg);
transition: all 0.2s;
opacity: 0.6;
}
.toggle-switch--active .toggle-switch__thumb[data-v-c4d1cba7] {
.toggle-switch--active .toggle-switch__thumb[data-v-5b16b9d3] {
transform: translateX(16px);
background: #4299e1;
opacity: 1;
}
.toggle-switch:hover .toggle-switch__thumb[data-v-c4d1cba7] {
.toggle-switch:hover .toggle-switch__thumb[data-v-5b16b9d3] {
opacity: 1;
}
@@ -13226,7 +13226,7 @@ const _sfc_main$4 = /* @__PURE__ */ defineComponent({
};
}
});
const LoraCyclerSettingsView = /* @__PURE__ */ _export_sfc(_sfc_main$4, [["__scopeId", "data-v-c4d1cba7"]]);
const LoraCyclerSettingsView = /* @__PURE__ */ _export_sfc(_sfc_main$4, [["__scopeId", "data-v-5b16b9d3"]]);
const _hoisted_1$3 = { class: "search-container" };
const _hoisted_2$2 = { class: "lora-list" };
const _hoisted_3$1 = ["onMouseenter", "onClick"];

File diff suppressed because one or more lines are too long