Compare commits

..

102 Commits

Author SHA1 Message Date
Will Miao
f09224152a feat: bump version to 0.9.11 2025-11-29 17:46:06 +08:00
Will Miao
df93670598 feat: add checkpoint metadata to EXIF recipe data
Add support for storing checkpoint information in image EXIF metadata. The checkpoint data is simplified and includes fields like model ID, version, name, hash, and base model. This allows for better tracking of AI model checkpoints used in image generation workflows.
2025-11-29 08:46:38 +08:00
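The commit above lists the fields kept in the simplified checkpoint record. A minimal sketch of that reduction, with illustrative key names (the project's actual EXIF schema is not shown in this log):

```python
def simplify_checkpoint(checkpoint: dict) -> dict:
    """Reduce a full checkpoint record to the fields named in the commit:
    model ID, version, name, hash, and base model. Key names are assumptions."""
    keys = ("modelId", "modelVersionId", "name", "hash", "baseModel")
    # Drop anything not in the whitelist, and skip missing values entirely.
    return {k: checkpoint[k] for k in keys if checkpoint.get(k) is not None}
```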
Will Miao
073fb3a94a feat(recipe-parser): enhance LoRA metadata with local file matching
Add comprehensive local file matching for LoRA entries in recipe metadata:
- Add modelVersionId-based lookup via new _get_lora_from_version_index method
- Extend LoRA entry with additional fields: existsLocally, inLibrary, localPath, thumbnailUrl, size
- Improve local file detection by checking both SHA256 hash and modelVersionId
- Set default thumbnail URL and size values for missing LoRA files
- Add proper typing with Optional imports for better code clarity

This provides more accurate local file status and metadata for LoRA entries in recipes.
2025-11-29 08:29:05 +08:00
Will Miao
53c4165d82 feat(parser): enhance model metadata extraction in Automatic1111 parser
- Add MODEL_NAME_PATTERN regex to extract model names from parameters
- Extract model hash from parsed hashes when available in metadata
- Add checkpoint model hash and name extraction from parameters section
- Implement checkpoint resource processing from Civitai metadata
- Improve model information completeness for better recipe tracking
2025-11-29 08:13:55 +08:00
Will Miao
8cd4550189 feat: add Flux.2 D and ZImageTurbo model constants
Add new model constants for Flux.2 D and ZImageTurbo to the BASE_MODELS object,
along with their corresponding abbreviations in BASE_MODEL_ABBREVIATIONS. Also
include these new models in the appropriate categories within BASE_MODEL_CATEGORIES.

This update ensures the application can properly recognize and handle these
newly supported AI models in the system.
2025-11-28 11:42:46 +08:00
Will Miao
2b2e4fefab feat(tests): restructure test HTML to nest elements under model modal
Refactor the test HTML structure to properly nest all model metadata elements within the model modal container. This improves test accuracy by matching the actual DOM structure used in the application, ensuring that element selection and event handling work correctly during testing.
2025-11-27 20:44:05 +08:00
Will Miao
5f93648297 feat: scope DOM queries to modal element in ModelMetadata
Refactor updateModalFilePathReferences function to scope all DOM queries within the modal element. This prevents potential conflicts with other elements on the page that might have the same CSS selectors. Added helper functions scopedQuery and scopedQueryAll to limit element selection to the modal context, improving reliability and preventing unintended side effects.
2025-11-27 20:33:04 +08:00
pixelpaws
8a628f0bd0 Merge pull request #703 from willmiao/fix/showcase-listener-leaks
fix(showcase): tear down modal listeners
2025-11-27 20:09:45 +08:00
Will Miao
b67c8598d6 feat(metadata): clear stale cache entries when metadata is empty
Update metadata registry to remove cache entries when node metadata becomes empty instead of keeping stale data. This prevents accumulation of unused cache entries and ensures cache only contains valid metadata. Added test case to verify cache behavior when LoRA configurations are removed.
2025-11-27 20:04:38 +08:00
Will Miao
0254c9d0e9 fix(showcase): tear down modal listeners 2025-11-27 18:00:59 +08:00
Will Miao
ecb512995c feat(civitai): expand image metadata detection criteria, see #700
Add additional CivitAI image metadata fields to detection logic including generation parameters (prompt, steps, sampler, etc.) and model information. Also improve LoRA hash detection by checking both main metadata and nested meta objects. This ensures more comprehensive identification of CivitAI image metadata across different response formats.
2025-11-27 10:28:04 +08:00
Will Miao
f8b9fa9b20 fix(civitai): improve metadata parsing for nested structures, see #700
- Refactor metadata detection to handle nested "meta" objects
- Add support for lowercase "lora:" hash keys
- Extract metadata from nested "meta" field when present
- Update tests to verify nested metadata parsing
- Handle case-insensitive LORA hash detection

The changes ensure proper parsing of Civitai image metadata that may be wrapped in nested structures, improving compatibility with different API response formats.
2025-11-26 13:46:08 +08:00
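The nested-structure handling described above can be sketched as follows; the field names (`meta`, `hashes`) mirror Civitai's response shape, but this is an illustration, not the project's actual parser:

```python
def extract_meta(payload: dict) -> dict:
    # When the metadata is wrapped in a nested "meta" object, unwrap it;
    # otherwise treat the payload itself as the metadata.
    meta = payload.get("meta")
    return meta if isinstance(meta, dict) else payload

def has_lora_hashes(meta: dict) -> bool:
    # Case-insensitive detection of "lora:<name>" keys in the hashes map,
    # covering both "LORA:" and lowercase "lora:" variants.
    hashes = meta.get("hashes", {})
    return any(k.lower().startswith("lora:") for k in hashes)
```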
Will Miao
5d4917c8d9 feat: add v0.9.10 release notes with new features and improvements
- Implement smarter update matching with base model grouping options
- Add flexible tag filtering with include/exclude functionality
- Display license icons and add license filtering controls
- Improve recipes with zero-LoRA imports and checkpoint references
- Enhance ZIP downloads with automatic model extraction
- Update template workflow with improved guidance
- Include various bug fixes and stability improvements
2025-11-24 11:15:05 +08:00
Will Miao
a50309c22e feat: update template workflow and image assets 2025-11-24 10:22:12 +08:00
Will Miao
f5020e081f feat(autocomplete): restrict embeddings autocomplete to explicit prefix
Only trigger autocomplete for embeddings when the current token starts with "emb:" prefix. This prevents interrupting normal prompt typing while maintaining quick manual access to embeddings suggestions.
2025-11-22 20:55:20 +08:00
Will Miao
3c0bfcb226 feat: add KSampler_inspire node extractor for comfyui-inspire-pack, fixes #693 2025-11-22 14:28:44 +08:00
Will Miao
9198a23ba9 feat: normalize and validate checkpoint entries before enrichment
Add _normalize_checkpoint_entry method to handle legacy checkpoint data formats (strings, tuples) by converting them to dictionaries. This prevents errors during enrichment when checkpoint data is not in the expected dictionary format. Invalid checkpoint entries are now removed instead of causing processing failures.

- Update get_paginated_data and get_recipe_by_id methods to use normalization
- Add test cases for legacy string and tuple checkpoint formats
- Ensure backward compatibility with existing checkpoint handling
2025-11-21 23:36:32 +08:00
Will Miao
02bac7edfb feat: normalize and validate checkpoint entries in recipes
Add _normalize_checkpoint_entry method to handle legacy and malformed checkpoint data by:
- Converting string entries to structured dict format
- Handling single-element lists/tuples recursively
- Dropping invalid entries with appropriate warnings
- Maintaining backward compatibility while improving data consistency

Add test case to verify string checkpoint conversion works correctly.
2025-11-21 23:00:02 +08:00
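The normalization rules in the two commits above (strings to dicts, single-element lists/tuples unwrapped recursively, invalid entries dropped with a warning) can be sketched like this; the `name` key is an assumption about the structured format:

```python
import logging

logger = logging.getLogger(__name__)

def normalize_checkpoint_entry(entry):
    """Hypothetical sketch of _normalize_checkpoint_entry."""
    if isinstance(entry, dict):
        return entry  # already in the expected format
    if isinstance(entry, str):
        return {"name": entry}  # legacy string entry -> structured dict
    if isinstance(entry, (list, tuple)) and len(entry) == 1:
        return normalize_checkpoint_entry(entry[0])  # unwrap recursively
    logger.warning("Dropping invalid checkpoint entry: %r", entry)
    return None
```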
Will Miao
ea1d1a49c9 feat: enhance search with include/exclude tokens and improved sorting
- Add token parsing to support include/exclude search terms using "-" prefix
- Implement token-based matching logic for relative path searches
- Improve search result sorting by prioritizing prefix matches and match position
- Add frontend test for multi-token highlighting with exclusion support
2025-11-21 19:48:43 +08:00
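The include/exclude token scheme above can be sketched roughly as follows, assuming whitespace-separated tokens and case-insensitive substring matching against relative paths:

```python
def parse_tokens(query: str):
    """Split a search query into include and exclude terms; a leading "-"
    marks a term for exclusion."""
    include, exclude = [], []
    for token in query.split():
        if token.startswith("-") and len(token) > 1:
            exclude.append(token[1:].lower())
        else:
            include.append(token.lower())
    return include, exclude

def matches(path: str, include, exclude) -> bool:
    # A path matches when every include term appears and no exclude term does.
    p = path.lower()
    return all(t in p for t in include) and not any(t in p for t in exclude)
```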
Will Miao
9a789f8f08 feat: add checkpoint hash filtering and navigation
- Add checkpoint hash parameter parsing to backend routes
- Implement checkpoint hash filtering in frontend API client
- Add click navigation from recipe modal to checkpoints page
- Update checkpoint items to use pointer cursor for better UX

Checkpoint items in the recipe modal are now clickable and navigate to the checkpoints page with the appropriate hash filter applied. This streamlines the workflow for users who want to view checkpoint details from recipes.
2025-11-21 16:17:01 +08:00
Will Miao
1971881537 feat: add checkpoint scanner integration to recipe scanner
- Add CheckpointScanner dependency to RecipeScanner singleton
- Implement checkpoint enrichment in recipe data processing
- Add _enrich_checkpoint_entry method to enhance checkpoint metadata
- Update recipe formatting to include checkpoint information
- Extend get_instance, __new__, and __init__ methods to support checkpoint scanner
- Add _get_checkpoint_from_version_index method for cache lookup

This enables recipe scanner to handle checkpoint models alongside existing LoRA support, providing complete model metadata for recipes.
2025-11-21 15:36:54 +08:00
Will Miao
4eb46a8d3e feat: consolidate checkpoint metadata handling
- Extract checkpoint entry from multiple metadata locations using helper method
- Sanitize checkpoint metadata by removing transient/local-only fields
- Remove checkpoint duplication from generation parameters to store only at top level
- Update frontend to properly populate checkpoint metadata during import
- Add tests for new checkpoint handling functionality

This ensures consistent checkpoint metadata structure and prevents data duplication across different storage locations.
2025-11-21 14:55:45 +08:00
Will Miao
36f28b3c65 feat: normalize LoRA preview URLs for browser accessibility
Add _normalize_preview_url method to ensure preview URLs are properly formatted for browser access. The method handles absolute paths by converting them to static URLs via config.get_preview_static_url, while preserving API paths and other valid URLs. This ensures consistent preview image display across different URL formats.

Update _enrich_lora_entry to apply URL normalization to preview URLs obtained from both hash-based lookups and version entries. Add comprehensive test coverage for absolute path normalization scenarios.
2025-11-21 12:31:23 +08:00
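The URL-normalization behavior above (absolute filesystem paths rewritten to static URLs, API paths and regular URLs passed through) might look roughly like this; the `/previews` prefix is a placeholder for what `config.get_preview_static_url` actually returns:

```python
import os

def normalize_preview_url(url: str, static_prefix: str = "/previews") -> str:
    """Sketch of _normalize_preview_url: convert absolute paths to static
    URLs while preserving API paths and other valid URLs."""
    if os.path.isabs(url) and not url.startswith("/api/"):
        # In the real code this delegates to config.get_preview_static_url.
        return static_prefix + "/" + os.path.basename(url)
    return url
```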
Will Miao
2452cc4df1 feat(recipes): resolve base model from checkpoint metadata
Add metadata service integration to automatically resolve base model information from checkpoint metadata during recipe import. This replaces the previous approach of relying solely on request parameters and provides more accurate base model information.

- Add _resolve_base_model_from_checkpoint method to fetch base model from metadata provider
- Update recipe import logic to use resolved base model when available
- Add comprehensive tests for base model resolution with fallback behavior
- Remove debug print statement from import parameters
2025-11-21 12:12:27 +08:00
Will Miao
eda1ce9743 feat: improve base model display with abbreviations in RecipeCard
- Import getBaseModelAbbreviation utility function
- Add fallback handling for missing base model values
- Display abbreviated base model names while keeping full name in tooltip
- Maintain "Unknown" label for recipes without base model specification
- Improve user experience by showing cleaner, more readable model identifiers
2025-11-21 11:36:17 +08:00
Will Miao
e24621a0af feat(recipe-scanner): add version index fallback for LoRA enrichment
Add _get_lora_from_version_index method to fetch cached LoRA entries by modelVersionId when hash is unavailable. This improves LoRA enrichment by using version index as fallback when hash is missing, ensuring proper library status, file paths, and preview URLs are set even without hash values.

Update test suite to include version_index in stub cache and add test coverage for version-based lookup functionality.
2025-11-21 11:27:09 +08:00
Will Miao
7173a2b9d6 feat: add remote recipe import functionality
Add support for importing recipes from remote sources by:
- Adding import_remote_recipe endpoint to RecipeHandlerSet
- Injecting downloader_factory and civitai_client_getter dependencies
- Implementing image download and resource parsing logic
- Supporting Civitai resource payloads with checkpoints and LoRAs
- Adding required imports for regex and temporary file handling

This enables users to import recipes directly from external sources like Civitai without manual file downloads.
2025-11-21 11:12:58 +08:00
pixelpaws
d540b21aac Merge pull request #691 from willmiao/feat/zip-preview
feat(downloads): support safetensors zips and previews
2025-11-20 19:56:31 +08:00
Will Miao
9952721e76 feat(downloads): support safetensors zips and previews 2025-11-20 19:41:31 +08:00
Will Miao
26e4895807 feat(auto-organize): improve exclusion handling and progress reporting
- Add auto_organize_exclusions to settings handler proxy keys
- Refactor model file service to handle exclusions relative to model roots
- Improve auto-organize progress reporting for empty operations
- Fix exclusion pattern matching to consider relative paths within model roots
- Ensure proper validation when no model roots are configured
- Add comprehensive cleanup reporting for empty auto-organize operations
2025-11-20 18:33:48 +08:00
Will Miao
c533a8e7bf feat: enhance Civitai metadata handling and image URL processing
- Import rewrite_preview_url utility for optimized image URL handling
- Update thumbnail URL processing for both LoRA and checkpoint entries to use rewritten URLs
- Expand checkpoint metadata with modelId, file size, SHA256 hash, and file name
- Improve error handling and data validation for Civitai API responses
- Maintain backward compatibility with existing data structures
2025-11-20 16:31:48 +08:00
pixelpaws
dc820a456f Merge pull request #690 from willmiao/codex/add-auto-organize-exclusions-field
Add auto-organize exclusion settings
2025-11-20 16:24:29 +08:00
pixelpaws
07721af87c feat(settings): add auto-organize exclusions 2025-11-20 16:08:32 +08:00
Will Miao
5093c30c06 feat: add video support to model version delete preview
- Extend CSS to style video elements in delete previews
- Add video rendering logic for model version previews
- Use consistent placeholder image for missing previews
- Maintain existing image preview functionality while adding video support

This allows users to see video previews when deleting model versions, improving the user experience for video-based models.
2025-11-19 22:42:58 +08:00
Will Miao
8c77080ae6 feat: conditionally hide license filters on recipes page
Add shouldShowLicenseFilters method to check if current page is 'recipes' and skip license filter initialization and updates when on recipes page. Also conditionally render license filter section in header template based on current page.

This prevents license filters from appearing on the recipes page where they are not applicable.
2025-11-19 22:26:16 +08:00
pixelpaws
bcf72c6bcc Merge pull request #689 from willmiao/civitai-deletion-logic
feat(metadata): improve civitai deletion detection logic, see #670
2025-11-19 19:27:28 +08:00
Will Miao
3849f7eef9 feat(metadata): improve civitai deletion detection logic
- Track when Civitai API returns "Model not found" for default provider
- Use dedicated flag instead of error string comparison for deletion detection
- Ensure archive-sourced models don't get marked as deleted
- Add test coverage for archive source deletion flag behavior
- Fix deletion flag logic to properly handle provider fallback scenarios
2025-11-19 19:16:40 +08:00
pixelpaws
7eced1e3e9 Merge pull request #686 from willmiao/fix/model-extension-delete-rename
fix(model): preserve original extension on rename
2025-11-19 11:43:01 +08:00
Will Miao
51b5261f40 fix(model): align rename extension detection 2025-11-19 11:20:09 +08:00
Will Miao
963f6b1383 fix(model): preserve original extension on rename 2025-11-19 11:08:08 +08:00
Will Miao
b75baa1d1a fix: support GGUF model deletion in model lifecycle service
- Add optional main_extension parameter to delete_model_artifacts function
- Extract file extension from model filename to handle different file types
- Update model scanner to pass file extension when deleting models
- Add test case for GGUF file deletion to ensure proper cleanup
- Maintain backward compatibility with existing safetensors models

This change allows the model lifecycle service to properly delete GGUF model files along with their associated metadata and preview files, expanding support beyond just safetensors format.
2025-11-19 10:36:03 +08:00
Will Miao
6d95e93378 feat: simplify model ID parsing and loading manager usage 2025-11-19 10:26:07 +08:00
pixelpaws
7117e0c33e Merge pull request #684 from willmiao/codex/add-check-update-to-single-model-context-menu
Add single-model update checks to context menus
2025-11-19 00:09:59 +08:00
pixelpaws
d261474f3a feat(context-menu): add single model update checks 2025-11-19 00:01:50 +08:00
pixelpaws
c09d67d2e4 Merge pull request #683 from willmiao/deletion-sync, see #673
feat(model-lifecycle): integrate model update service for deletion sync
2025-11-18 23:28:17 +08:00
Will Miao
1427dc8e38 feat(model-lifecycle): integrate model update service for deletion sync
Add ModelUpdateService dependency to ModelLifecycleService to enable synchronization during model deletion. The service is now passed through BaseModelRoutes initialization and used in delete_model to trigger updates when a model is removed. This ensures external systems stay in sync with local model state changes.

Key changes:
- Inject update_service into ModelLifecycleService constructor
- Extract model ID from metadata during deletion
- Call update service sync method after successful deletion
- Add proper type hints and TYPE_CHECKING imports
2025-11-18 21:02:39 +08:00
pixelpaws
77a7b90dc7 Merge pull request #682 from willmiao/feature/model-type-filter
Feature/model type filter
2025-11-18 18:51:52 +08:00
Will Miao
e9d55fe146 feat(filters): add model type filter 2025-11-18 16:43:44 +08:00
Will Miao
57f369a6de feat(model): add model type filtering support
- Add model_types parameter to ModelListingHandler to support filtering by model type
- Implement get_model_types endpoint in ModelQueryHandler to retrieve available model types
- Register new /api/lm/{prefix}/model-types route for model type queries
- Extend BaseModelService to handle model type filtering in queries
- Support both model_type and civitai_model_type query parameters for backward compatibility

This enables users to filter models by specific types, improving model discovery and organization capabilities.
2025-11-18 15:36:01 +08:00
Will Miao
059ebeead7 feat: include Negative file type in primary file selection for embeddings 2025-11-18 14:16:22 +08:00
Will Miao
831a9da9d7 feat: update version badge logic for same-base update strategy, see #676
- Remove unused isNewer variable calculation
- Use dividerThresholdVersionId instead of latestLibraryVersionId for badge logic
- Add test case to verify newer version badge appears with same-base strategy
- Ensures correct badge display when filtering by same base model versions
2025-11-18 11:18:32 +08:00
Will Miao
6000e08640 feat(i18n): add new license restriction translations
Add four new license restriction keys to all locale files:
- noImageSell: "No selling generated content"
- noRentCivit: "No Civitai generation"
- noRent: "No generation services"
- noSell: "No selling models"

These additions provide comprehensive coverage for various commercial and generation restrictions in the licensing system, ensuring proper localization across all supported languages.
2025-11-18 09:17:04 +08:00
pixelpaws
3edc65c106 Merge pull request #681 from willmiao/update-strategy, see #676
Add update flag strategy
2025-11-18 08:44:46 +08:00
Will Miao
655157434e feat(versions): add base filter toggle UI and styling
Add CSS classes and JavaScript logic for the base filter toggle button in the versions toolbar. The filter allows users to switch between showing all versions or only versions matching the current base model. Includes styling for different states (active, hover, disabled) and accessibility features like screen reader support.
2025-11-18 06:47:07 +08:00
Will Miao
3661b11b70 feat(i18n): add update flag strategy settings
Add new "updateFlags" section to settings navigation and implement update flag strategy configuration. The strategy allows users to choose when update badges appear:
- Match updates by base model (only show when new release shares same base model)
- Flag any available update (show whenever newer version exists)

Includes translations for English, German, Spanish, and French locales.
2025-11-17 20:02:26 +08:00
Will Miao
0e73db0669 feat: implement same_base update strategy for model annotations
Add support for configurable update flag strategy with new "same_base" mode that considers base model versions when determining update availability. The strategy is controlled by the "update_flag_strategy" setting.

When strategy is set to "same_base":
- Uses get_records_bulk instead of has_updates_bulk
- Compares model versions against highest local versions per base model
- Provides more granular update detection based on base model relationships

Fallback to existing bulk or individual update checks when:
- Strategy is not "same_base"
- Bulk operations fail
- Records are unavailable

This enables more precise update flagging for models sharing common bases.
2025-11-17 19:26:41 +08:00
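The "same_base" comparison described above can be sketched as follows, under the simplifying assumptions that versions are `(version_id, base_model)` pairs and that a higher version ID means a newer release:

```python
def has_same_base_update(local_versions, remote_versions):
    """Flag an update only when a remote version shares a base model with a
    local version and is newer than the highest local version for that base."""
    highest_local = {}
    for vid, base in local_versions:
        highest_local[base] = max(highest_local.get(base, 0), vid)
    return any(
        base in highest_local and vid > highest_local[base]
        for vid, base in remote_versions
    )
```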
Will Miao
8158441a92 feat: add CheckpointLoaderKJ extractor and improve model filename handling, fixes #666
- Add CheckpointLoaderKJ to NODE_EXTRACTORS mapping for KJNodes support
- Enhance model filename generation in SaveImage to handle different data types
- Add proper type checking and fallback for model metadata values
- Improve robustness when processing checkpoint paths for filename generation
2025-11-17 08:52:51 +08:00
pixelpaws
5600471093 Merge pull request #675 from willmiao/fix/portable-mode-sync
fix(settings): sync portable mode toggle
2025-11-16 17:52:01 +08:00
Will Miao
354cf03bbc fix(settings): sync portable mode toggle 2025-11-16 17:36:52 +08:00
Will Miao
645b7c247d feat(i18n): increase trigger word length limit from 30 to 100 characters
Update trigger word validation message across all language files to reflect increased character limit. The change allows users to create longer trigger words, providing more flexibility in trigger word creation while maintaining the existing maximum count of 30 trigger words.
2025-11-15 22:22:42 +08:00
Will Miao
5f25a29303 Revert "fix: only include active LoRAs when applying LoRA values to text", see #669
This reverts commit 1cdbb9a851.
2025-11-15 16:26:31 +08:00
Will Miao
906d00106d feat(trigger-words): increase maximum word limit from 30 to 100, fixes #660 2025-11-15 08:19:53 +08:00
Will Miao
7850131969 feat: add metadata extractor for KJNodes model loaders, see #666
Add KJNodesModelLoaderExtractor to handle metadata extraction from KJNodes loaders that expose model_name. This supports GGUFLoaderKJ and DiffusionModelLoaderKJ nodes, ensuring consistent checkpoint metadata collection across different node types.
2025-11-14 15:46:11 +08:00
pixelpaws
3d5ec4a9f1 Merge pull request #668 from Aaalice233/main
fix: only include active LoRAs when applying LoRA values to text
2025-11-14 15:19:32 +08:00
Luna_K
1cdbb9a851 fix: only include active LoRAs when applying LoRA values to text
- Add an active-state check in the applyLoraValuesToText function
- Skip a LoRA when its active property is false
- Preserve backward compatibility: treat a LoRA as active by default when active is undefined or null
- Ensure only user-selected LoRAs are applied to the workflow text
2025-11-14 13:53:10 +08:00
pixelpaws
e224be4b88 Merge pull request #664 from willmiao/codex/remove-recipevalidationerror-on-empty-lora_matches
Allow widget recipe saves without LoRA matches
2025-11-13 16:22:57 +08:00
pixelpaws
b9d3a4afce Merge pull request #665 from willmiao/codex/refactor-imageprocessor-to-normalize-loras-array
fix: allow importing recipes without loras
2025-11-13 16:22:40 +08:00
pixelpaws
aa4aa1a613 fix(import): allow zero lora recipes 2025-11-13 15:53:54 +08:00
pixelpaws
cc8e1c5049 fix(recipes): allow widget save without lora matches 2025-11-13 15:52:31 +08:00
pixelpaws
41e649415a Merge pull request #658 from willmiao/feature/global-license-refresh
Feature/global license refresh
2025-11-11 14:54:37 +08:00
Will Miao
c8f770a86b feat: batch process model license data retrieval 2025-11-11 14:36:19 +08:00
Will Miao
29bb85359e feat(context-menu): refresh missing license metadata 2025-11-11 14:24:59 +08:00
Will Miao
4557da8b63 feat(metadata): return tuple with metadata and success flag
Change `load_metadata` method to return a tuple containing both the metadata object and a boolean success flag instead of just the metadata object. This provides clearer error handling and allows callers to distinguish between successful loads with null metadata versus actual load failures.
2025-11-11 11:18:33 +08:00
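The new `(metadata, success)` contract described above might look like this minimal sketch, assuming JSON metadata files (the real method's storage format is not shown in this log):

```python
import json

def load_metadata(path):
    """Return (metadata, ok) so callers can distinguish a successful load
    that yielded no metadata from an actual load failure."""
    try:
        with open(path, "r", encoding="utf-8") as f:
            return json.load(f), True
    except (OSError, ValueError):
        return None, False
```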
pixelpaws
09b75de25b Merge pull request #656 from willmiao/feat/hash-chunk-size-config
feat(settings): add configurable hash chunk size
2025-11-10 10:18:00 +08:00
Will Miao
415fc5720c feat(settings): add configurable hash chunk size 2025-11-10 10:15:01 +08:00
Will Miao
4dd8ce778e feat(trigger): add optional strength adjustment for trigger words
Add `allow_strength_adjustment` parameter to enable mouse wheel adjustment of trigger word strengths. When enabled, strength values are preserved and can be modified interactively. Also improves trigger word parsing by handling whitespace more consistently and adding debug logging for trigger data inspection.
2025-11-09 22:24:23 +08:00
Will Miao
f81ff2efe9 feat: remove strength-based styling from tags widget
Remove visual styling for tags with modified strength values. The gold border and gradient background were previously applied to tags with strength values other than 1.0, but this visual distinction is no longer needed. This simplifies the tag styling logic and maintains consistent appearance across all tags regardless of their strength values.
2025-11-09 18:02:57 +08:00
Will Miao
837bb17b08 feat(comfyui): fix trigger word toggle widget initialization
Change from loadedGraphNode to nodeCreated lifecycle method to ensure proper widget initialization timing. Wrap widget creation and highlight logic in requestAnimationFrame to prevent race conditions with node setup. This ensures the trigger word toggle widget functions correctly when nodes are created.
2025-11-08 19:39:25 +08:00
Will Miao
5ee93a27ee feat: add license flags display to model preview tooltip #613
- Add optional license_flags parameter to model preview API endpoint
- Include license flags in response when requested via query parameter
- Add CSS styles for license overlay and icons in tooltip
- Implement license flag parsing and icon mapping logic
- Display license restrictions as icons in preview tooltip overlay

This allows users to see model license restrictions directly in the preview tooltip without needing to navigate to detailed model information pages.
2025-11-08 19:09:06 +08:00
Will Miao
2e6aa5fe9f feat: replace nodeCreated with loadedGraphNode for LoraManager nodes
- Change lifecycle hook from nodeCreated to loadedGraphNode in Lora Loader, Lora Stacker, and TriggerWord Toggle nodes
- Remove requestAnimationFrame wrappers as loadedGraphNode ensures proper initialization timing
- Maintain same functionality for restoring saved values and widget initialization
- Improves reliability by using the appropriate node lifecycle event
2025-11-08 14:08:43 +08:00
pixelpaws
c14e066f8f Merge pull request #651 from willmiao/tag-filtering-with-include-exclude-states, see #622
feat: implement tag filtering with include/exclude states
2025-11-08 12:01:13 +08:00
Will Miao
c09100c22e feat: implement tag filtering with include/exclude states
- Update frontend tag filter to cycle through include/exclude/clear states
- Add backend support for tag_include and tag_exclude query parameters
- Maintain backward compatibility with legacy tag parameter
- Store tag states as dictionary with 'include'/'exclude' values
- Update test matrix documentation to reflect new tag behavior

The changes enable more granular tag filtering where users can now explicitly include or exclude specific tags, rather than just adding tags to a simple inclusion list. This provides better control over search results and improves the filtering user experience.
2025-11-08 11:45:31 +08:00
pixelpaws
839ed3bda3 Merge pull request #650 from willmiao/license-filter, see #548 and #613
License filter
2025-11-08 10:30:32 +08:00
Will Miao
1f627774c1 feat(i18n): add license and content usage filter labels
Add new translation keys for model filter interface:
- license
- noCreditRequired
- allowSellingGeneratedContent

These labels support new filtering options for model licensing and content usage permissions, enabling users to filter models based on their license requirements and commercial usage rights.
2025-11-08 10:20:28 +08:00
Will Miao
3b842355c2 feat: add license-based filtering for model listings
Add support for filtering models by license requirements:
- credit_required: filter models that require credits or allow free use
- allow_selling_generated_content: filter models based on commercial usage rights

These filters use license_flags bitmask to determine model permissions and enable users to find models that match their specific usage requirements and budget constraints.
2025-11-07 22:28:29 +08:00
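The bitmask-based filtering above can be sketched as follows; the bit positions are purely illustrative, since the actual `license_flags` layout is not documented in these commits:

```python
# Hypothetical bit assignments for illustration only.
CREDIT_REQUIRED = 1 << 0
ALLOW_SELL_GENERATED = 1 << 1

def passes_license_filters(license_flags: int,
                           credit_required=None,
                           allow_selling_generated_content=None) -> bool:
    """Apply optional license filters; None means 'do not filter on this'."""
    if credit_required is not None:
        if bool(license_flags & CREDIT_REQUIRED) != credit_required:
            return False
    if allow_selling_generated_content is not None:
        if bool(license_flags & ALLOW_SELL_GENERATED) != allow_selling_generated_content:
            return False
    return True
```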
Will Miao
dd27411ebf feat(trigger-word-toggle): add strength value support for trigger words
- Extract and preserve strength values from trigger words in format "(word:strength)"
- Maintain strength formatting when filtering active trigger words in both group and individual modes
- Update active state tracking to handle strength-modified words correctly
- Ensure backward compatibility with existing trigger word formats
2025-11-07 16:38:04 +08:00
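The `(word:strength)` format above suggests round-trip parsing along these lines; a sketch, not the widget's actual implementation:

```python
import re

STRENGTH_PATTERN = re.compile(r"^\((.+):([\d.]+)\)$")

def parse_trigger_word(token: str):
    """Extract (word, strength) from "(word:strength)"; plain words get the
    default strength 1.0."""
    m = STRENGTH_PATTERN.match(token.strip())
    if m:
        return m.group(1), float(m.group(2))
    return token.strip(), 1.0

def format_trigger_word(word: str, strength: float) -> str:
    # Preserve strength formatting only when it differs from the default.
    return f"({word}:{strength})" if strength != 1.0 else word
```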
Will Miao
388ff7f5b4 feat(ui): add trigger word highlighting for selected LoRAs
- Import applySelectionHighlight in lora_loader and lora_stacker
- Pass onSelectionChange callback to loras_widget to handle selection changes
- Implement selection tracking and payload building in loras_widget
- Emit selection changes when LoRA selection is modified
- Update tags_widget to support highlighted tag styling

This provides visual feedback when LoRAs are selected by highlighting associated trigger words in the interface.
2025-11-07 16:08:56 +08:00
Will Miao
f76343f389 feat(lora): add mode change listeners to update trigger words
Add property descriptor to listen for mode changes in Lora Loader and Lora Stacker nodes. When node mode changes, automatically update connected trigger word toggle nodes and downstream loader nodes to maintain synchronization between node modes and trigger word states.

- Lora Loader: Updates connected trigger words when mode changes
- Lora Stacker: Updates connected trigger words and downstream loaders when mode changes
- Both nodes log mode changes for debugging purposes
2025-11-07 15:11:59 +08:00
Will Miao
ce5a1ae3d0 feat(lora-stacker): conditionally update trigger words based on node mode
Add node mode checks to ensure trigger words are only updated when the stacker node is active (mode 0 for Always or mode 3 for On Trigger). This prevents unnecessary updates when the node is inactive (mode 2 for Never or mode 4 for Bypass), improving performance and ensuring trigger words reflect the actual active state of the node.

The changes include:
- Adding mode checks before updating active LoRA names in the stacker callback
- Modifying collectActiveLorasFromChain to only include active nodes
- Adding comments to clarify node mode behavior
2025-11-07 14:21:58 +08:00
pixelpaws
1d40d7400f Merge pull request #648 from willmiao/fix-rate-limit-retry, see #647
feat(metadata): add rate limit retry support to metadata providers
2025-11-07 10:57:48 +08:00
Will Miao
1bb5d0b072 feat(metadata): add rate limit retry support to metadata providers
Add RateLimitRetryingProvider and _RateLimitRetryHelper classes to handle rate limiting with exponential backoff retries. Update get_metadata_provider function to automatically wrap providers with rate limit handling. This improves reliability when external APIs return rate limit errors by implementing automatic retries with configurable delays and jitter.
2025-11-07 09:18:59 +08:00
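The retry behavior described above (exponential backoff with jitter on rate-limit errors) can be sketched like this; `RateLimitError` and the delay parameters are assumptions, not the project's actual API:

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for whatever exception the real providers raise on HTTP 429."""

def retry_on_rate_limit(call, max_retries=3, base_delay=1.0):
    """Retry `call` on rate-limit errors with exponential backoff plus jitter,
    re-raising once the retry budget is exhausted."""
    for attempt in range(max_retries + 1):
        try:
            return call()
        except RateLimitError:
            if attempt == max_retries:
                raise
            # Delay doubles each attempt; jitter spreads out concurrent retries.
            delay = base_delay * (2 ** attempt) + random.uniform(0, base_delay)
            time.sleep(delay)
```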
Will Miao
c3932538e1 feat: add git reset and clean before nightly and release updates, fixes #646
Add hard reset and clean operations to ensure a clean working directory
before switching branches or checking out release tags. This prevents
local changes from interfering with the update process and ensures
consistent behavior across both nightly and release update paths.
2025-11-07 08:17:20 +08:00
Will Miao
a68141adf4 feat(i18n): add license restriction translations for multiple languages
Add license-related translation keys including credit requirements, derivative restrictions, and license sharing permissions. This supports displaying proper license information and restrictions in the UI across all supported languages (DE, EN, ES, FR, HE, JA, KO, RU).
2025-11-06 23:01:29 +08:00
Will Miao
fb8ba4c076 feat: update commercial icon configuration order 2025-11-06 22:55:08 +08:00
Will Miao
4ed3bd9039 feat: refactor model hash lookup to improve error handling and code clarity
- Simplify error handling logic by checking for "not found" message directly
- Extract model data fetching into separate _fetch_model_data method
- Extract version enrichment into separate _enrich_version_with_model_data method
- Improve logging consistency using %s formatting
- Rename variables for better clarity (result -> version, e -> exc)
2025-11-06 22:41:50 +08:00
Will Miao
ba6e2eadba feat: update license flag handling and default permissions
- Update DEFAULT_LICENSE_FLAGS from 57 to 127 to enable all commercial modes by default
- Replace CommercialUseLevel enum with bitwise commercial permission handling
- Simplify commercial value normalization and validation using allowed values set
- Adjust bit shifting in license flag construction to accommodate new commercial bits structure
- Remove CommercialUseLevel from exports and update tests accordingly
- Improve handling of empty commercial use values with proper type checking

The changes streamline commercial permission processing and align with CivitAI's default license configuration while maintaining backward compatibility.
2025-11-06 22:14:36 +08:00
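The switch from a `CommercialUseLevel` enum to bitwise flags can be illustrated with a sketch like the one below. The bit positions and names here are assumptions for illustration — the project's real flag layout is internal — but they show why a value of 127 enables every permission bit while 57 leaves some cleared:

```python
# Assumed bit layout, for illustration only.
ALLOW_NO_CREDIT         = 1 << 0
ALLOW_DERIVATIVES       = 1 << 1
ALLOW_DIFFERENT_LICENSE = 1 << 2
COMMERCIAL_IMAGE        = 1 << 3
COMMERCIAL_RENT         = 1 << 4
COMMERCIAL_RENT_CIVIT   = 1 << 5
COMMERCIAL_SELL         = 1 << 6

DEFAULT_LICENSE_FLAGS = 0b1111111  # 127: all seven permission bits set

def has_permission(flags, bit):
    """Check whether a permission bit is set in the flag word."""
    return bool(flags & bit)
```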
Will Miao
1c16392367 feat: improve license restriction labels for clarity
Update license restriction labels in ModelModal component to be more descriptive and user-friendly. Changed fallback text and translation keys for various license restrictions including:
- Selling models
- Generation services
- Civitai generation
- Selling generated content
- Creator credit requirements
- Sharing merges
- Permission requirements

The changes make the license restrictions more clear and specific about what actions are prohibited or required.
2025-11-06 21:31:28 +08:00
Will Miao
035ad4b473 feat(ui): increase border opacity and adjust color for better visibility
Update the --lora-border CSS custom property to use a darker, more opaque color. The previous border color was too subtle and lacked sufficient contrast against the background. This change improves visual hierarchy and makes interface elements more distinguishable.
2025-11-06 21:17:29 +08:00
Will Miao
a7ee883227 feat(modal): add license restriction indicators to model modal
Add visual indicators for commercial license restrictions in the model modal. New CSS classes and JavaScript utilities handle the display of restriction icons for selling, renting, and image usage limitations. The modal header actions container has been restructured to accommodate the new license restriction section.

- Add `.modal-header-actions` and `.license-restrictions` CSS classes
- Implement commercial license icon configuration and rendering logic
- Normalize and sanitize commercial restriction values
- Update header layout to remove bottom margin for better visual alignment
2025-11-06 21:04:59 +08:00
Will Miao
ddf9e33961 feat: add license information handling for Civitai models
Add license resolution utilities and integrate license information into model metadata processing. The changes include:

- Add `resolve_license_payload` function to extract license data from Civitai model responses
- Integrate license information into model metadata in CivitaiClient and MetadataSyncService
- Add license flags support in model scanning and caching
- Implement CommercialUseLevel enum for standardized license classification
- Update model scanner to handle unknown fields when extracting metadata values

This ensures proper license attribution and compliance when working with Civitai models.
2025-11-06 17:05:54 +08:00
Will Miao
4301b3455f feat(civarchive_client): remove HTML scraping implementation and bs4 dependency
Remove legacy HTML scraping implementation of get_model_by_url method
and associated BeautifulSoup dependency. The functionality has been
replaced by API-based implementation in get_model_version method.

This simplifies the codebase and removes the optional bs4 dependency,
making the client more maintainable and reliable.
2025-11-05 22:31:39 +08:00
Will Miao
3d6bb432c4 feat: normalize tags to lowercase for Windows compatibility, see #637
Convert all tags to lowercase in tag processing logic to prevent case sensitivity issues on Windows filesystems. This ensures consistent tag matching and prevents duplicate tags with different cases from being created.

Changes include:
- TagUpdateService now converts tags to lowercase before comparison
- Utils function converts model tags to lowercase before priority resolution
- Test cases updated to reflect lowercase tag expectations
2025-11-04 12:54:09 +08:00
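The lowercase normalization described above amounts to something like the following (an illustrative sketch, not the project's actual `TagUpdateService` code):

```python
def normalize_tags(tags):
    """Lowercase tags and drop case-only duplicates, preserving first-seen order."""
    seen = set()
    result = []
    for tag in tags:
        lowered = tag.strip().lower()
        if lowered and lowered not in seen:
            seen.add(lowered)
            result.append(lowered)
    return result
```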
151 changed files with 10152 additions and 1161 deletions

View File

@@ -34,6 +34,15 @@ Enhance your Civitai browsing experience with our companion browser extension! S
## Release Notes
### v0.9.10
* **Smarter Update Matching** - Update checks and grouping can now match by base model only or ignore base-model constraints; version lists also support toggling between same-base versions and all versions.
* **Flexible Tag Filtering** - The filter panel now supports tag exclusion: click a tag to include, click again to exclude, and click a third time to clear, enabling stronger and more flexible tag filters.
* **License Visibility & Controls** - Model detail headers and ComfyUI preview popups now show Civitai license icons. The filter panel gains license include/exclude options, and a new global context menu action, "Refresh license metadata," fetches missing license data.
* **Recipe Improvements** - Recipes now allow importing with zero LoRAs, and recipe detail pages show the related checkpoint for easier reference.
* **Better ZIP Downloads** - When downloading models packaged in ZIPs, model files are extracted into the target model folder; ZIPs containing multiple model files (e.g., WanVideo high/low LoRA pairs) are added as separate models.
* **Template Workflow Update** - Refreshed the "Illustrious Pony Example" template workflow with usage guidance for each LoRA Manager node.
* **Bug Fixes & Stability** - General fixes and stability improvements.
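The three-state tag filter described in the release notes can be sketched as below (a hypothetical Python analogue of the cycling behavior; the real logic lives in the frontend `FilterManager`):

```python
def cycle_tag_state(filters, tag):
    """Cycle a tag through include -> exclude -> cleared, mutating filters in place."""
    state = filters.get(tag)
    if state is None:
        filters[tag] = "include"   # first click: include the tag
    elif state == "include":
        filters[tag] = "exclude"   # second click: exclude it instead
    else:
        del filters[tag]           # third click: clear the tag entirely
    return filters
```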
### v0.9.9
* **Check for Updates Feature** - Users can now check for updates for all models or selected models in bulk mode. Models with available updates will display an "update available" badge on their model card, and users can filter to show only models with updates.
* **Model Versions Management** - Added a new Versions tab in the model modal that centralizes all versions of a model, providing download, delete, and ignore update functions.
@@ -71,34 +80,6 @@ Enhance your Civitai browsing experience with our companion browser extension! S
* **Automatic Filename Conflict Resolution** - Implemented automatic file renaming (`original name + short hash`) to prevent conflicts when downloading or moving models.
* **Performance Optimizations & Bug Fixes** - Various performance improvements and bug fixes for a more stable and responsive experience.
### v0.8.30
* **Automatic Model Path Correction** - Added auto-correction for model paths in built-in nodes such as Load Checkpoint, Load Diffusion Model, Load LoRA, and other custom nodes with similar functionality. Workflows containing outdated or incorrect model paths will now be automatically updated to reflect the current location of your models.
* **Node UI Enhancements** - Improved node interface for a smoother and more intuitive user experience.
* **Bug Fixes** - Addressed various bugs to enhance stability and reliability.
### v0.8.29
* **Enhanced Recipe Imports** - Improved recipe importing with new target folder selection, featuring path input autocomplete and interactive folder tree navigation. Added a "Use Default Path" option when downloading missing LoRAs.
* **WanVideo Lora Select Node Update** - Updated the WanVideo Lora Select node with a 'merge_loras' option to match the counterpart node in the WanVideoWrapper node package.
* **Autocomplete Conflict Resolution** - Resolved an autocomplete feature conflict in LoRA nodes with pysssss autocomplete.
* **Improved Download Functionality** - Enhanced download functionality with resumable downloads and improved error handling.
* **Bug Fixes** - Addressed several bugs for improved stability and performance.
### v0.8.28
* **Autocomplete for Node Inputs** - Instantly find and add LoRAs by filename directly in Lora Loader, Lora Stacker, and WanVideo Lora Select nodes. Autocomplete suggestions include preview tooltips and preset weights, allowing you to quickly select LoRAs without opening the LoRA Manager UI.
* **Duplicate Notification Control** - Added a switch to duplicates mode, enabling users to turn off duplicate model notifications for a more streamlined experience.
* **Download Example Images from Context Menu** - Introduced a new context menu option to download example images for individual models.
### v0.8.27
* **User Experience Enhancements** - Improved the model download target folder selection with path input autocomplete and interactive folder tree navigation, making it easier and faster to choose where models are saved.
* **Default Path Option for Downloads** - Added a "Use Default Path" option when downloading models. When enabled, models are automatically organized and stored according to your configured path template settings.
* **Advanced Download Path Templates** - Expanded path template settings, allowing users to set individual templates for LoRA, checkpoint, and embedding models for greater flexibility. Introduced the `{author}` placeholder, enabling automatic organization of model files by creator name.
* **Bug Fixes & Stability Improvements** - Addressed various bugs and improved overall stability for a smoother experience.
### v0.8.26
* **Creator Search Option** - Added ability to search models by creator name, making it easier to find models from specific authors.
* **Enhanced Node Usability** - Improved user experience for Lora Loader, Lora Stacker, and WanVideo Lora Select nodes by fixing the maximum height of the text input area. Users can now freely and conveniently adjust the LoRA region within these nodes.
* **Compatibility Fixes** - Resolved compatibility issues with ComfyUI and certain custom nodes, including ComfyUI-Custom-Scripts, ensuring smoother integration and operation.
[View Update History](./update_logs.md)
---

View File

@@ -21,7 +21,7 @@ This matrix captures the scenarios that Phase 3 frontend tests should cover for
| ID | Feature | Scenario | LoRAs Expectations | Checkpoints Expectations | Notes |
| --- | --- | --- | --- | --- | --- |
| F-01 | Search filter | Typing a query updates `pageState.filters.search`, persists to session, and triggers `resetAndReload` on submit | Validate `SearchManager` writes query and reloads via API stub; confirm LoRA cards pass query downstream | Same as LoRAs | Cover `enter` press and clicking search icon |
| F-02 | Tag filter | Selecting a tag chip cycles include ➜ exclude ➜ clear, updates storage, and reloads results | Tag state stored under `filters.tags[tagName] = 'include' \| 'exclude'`; `FilterManager.applyFilters` persists and triggers `resetAndReload(true)` | Same; ensure base model tag set is scoped to checkpoints dataset | Include removal path |
| F-03 | Base model filter | Toggling base model checkboxes updates `filters.baseModel`, persists, and reloads | Ensure only LoRA-supported models show; toggle multi-select | Ensure SDXL/Flux base models appear as expected | Capture UI state restored from storage on next init |
| F-04 | Favorites-only | Clicking favorites toggle updates session flag and calls `resetAndReload(true)` | Button gains `.active` class and API called | Same | Verify duplicates badge refresh when active |
| F-05 | Sort selection | Changing sort select saves preference (legacy + new format) and reloads | Confirm `PageControls.saveSortPreference` invoked with option and API called | Same with checkpoints-specific defaults | Cover `convertLegacySortFormat` branch |

View File

@@ -152,6 +152,13 @@
"none": "Keine Beispielbild-Ordner mussten bereinigt werden",
"partial": "Bereinigung abgeschlossen, {failures} Ordner übersprungen",
"error": "Fehler beim Bereinigen der Beispielbild-Ordner: {message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "Modelle filtern",
"baseModel": "Basis-Modell",
"modelTags": "Tags (Top 20)",
"modelTypes": "Model Types",
"license": "Lizenz",
"noCreditRequired": "Kein Credit erforderlich",
"allowSellingGeneratedContent": "Verkauf erlaubt",
"clearAll": "Alle Filter löschen"
},
"theme": {
@@ -220,10 +231,17 @@
"priorityTags": "Prioritäts-Tags",
"downloadPathTemplates": "Download-Pfad-Vorlagen",
"exampleImages": "Beispielbilder",
"updateFlags": "Update-Markierungen",
"autoOrganize": "Auto-organize",
"misc": "Verschiedenes",
"metadataArchive": "Metadaten-Archiv-Datenbank",
"storageLocation": "Einstellungsort",
"proxySettings": "Proxy-Einstellungen"
},
"storage": {
"locationLabel": "Portabler Modus",
"locationHelp": "Aktiviere, um settings.json im Repository zu belassen; deaktiviere, um es im Benutzerkonfigurationsordner zu speichern."
},
"contentFiltering": {
"blurNsfwContent": "NSFW-Inhalte unscharf stellen",
"blurNsfwContentHelp": "Nicht jugendfreie (NSFW) Vorschaubilder unscharf stellen",
@@ -234,6 +252,15 @@
"autoplayOnHover": "Videos bei Hover automatisch abspielen",
"autoplayOnHoverHelp": "Video-Vorschauen nur beim Darüberfahren mit der Maus abspielen"
},
"autoOrganizeExclusions": {
"label": "Auto-Organisierungs-Ausnahmen",
"placeholder": "Beispiel: curated/*, */backups/*; *_temp.safetensors",
"help": "Dateien überspringen, die mit diesen Wildcard-Mustern übereinstimmen. Mehrere Muster mit Kommas oder Semikolons trennen.",
"validation": {
"noPatterns": "Geben Sie mindestens ein Muster ein, getrennt durch Kommas oder Semikolons.",
"saveFailed": "Fehler beim Speichern der Ausschlüsse: {message}"
}
},
"layoutSettings": {
"displayDensity": "Anzeige-Dichte",
"displayDensityOptions": {
@@ -256,7 +283,6 @@
"hover": "Bei Hover anzeigen"
},
"cardInfoDisplayHelp": "Wählen Sie, wann Modellinformationen und Aktionsschaltflächen angezeigt werden sollen",
"modelCardFooterAction": "Aktion der Modellkarten-Schaltfläche",
"modelCardFooterActionOptions": {
"exampleImages": "Beispielbilder öffnen",
@@ -350,6 +376,14 @@
"download": "Herunterladen",
"restartRequired": "Neustart erforderlich"
},
"updateFlagStrategy": {
"label": "Strategie für Update-Markierungen",
"help": "Entscheide, ob Update-Badges nur dann erscheinen, wenn eine neue Version dasselbe Basismodell wie deine lokalen Dateien verwendet, oder sobald es irgendein neueres Release für dieses Modell gibt.",
"options": {
"sameBase": "Updates nach Basismodell abgleichen",
"any": "Jede verfügbare Aktualisierung markieren"
}
},
"misc": {
"includeTriggerWords": "Trigger Words in LoRA-Syntax einschließen",
"includeTriggerWordsHelp": "Trainierte Trigger Words beim Kopieren der LoRA-Syntax in die Zwischenablage einschließen"
@@ -472,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "Civitai-Daten aktualisieren",
"checkUpdates": "Updates prüfen",
"relinkCivitai": "Mit Civitai neu verknüpfen",
"copySyntax": "LoRA-Syntax kopieren",
"copyFilename": "Modell-Dateiname kopieren",
@@ -493,6 +528,9 @@
},
"recipes": {
"title": "LoRA-Rezepte",
"actions": {
"sendCheckpoint": "Send to ComfyUI"
},
"controls": {
"import": {
"action": "Importieren",
@@ -876,6 +914,16 @@
"recipes": "Rezepte",
"versions": "Versionen"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "Ersteller-Angabe erforderlich",
"noDerivatives": "Keine gemeinsamen Zusammenführungen",
"noReLicense": "Gleiche Berechtigungen erforderlich",
"restrictionsLabel": "Lizenzbeschränkungen"
},
"loading": {
"exampleImages": "Beispielbilder werden geladen...",
"description": "Modellbeschreibung wird geladen...",
@@ -909,6 +957,18 @@
"viewLocalVersions": "Alle lokalen Versionen anzeigen",
"viewLocalTooltip": "Demnächst verfügbar"
},
"filters": {
"label": "Basisfilter",
"state": {
"showAll": "Alle Versionen",
"showSameBase": "Gleiches Basismodell"
},
"tooltip": {
"showAllVersions": "Wechseln, um alle Versionen anzuzeigen",
"showSameBaseVersions": "Wechseln, um nur Versionen mit demselben Basismodell anzuzeigen"
},
"empty": "Keine Versionen entsprechen dem Filter für das aktuelle Basismodell."
},
"empty": "Noch keine Versionshistorie für dieses Modell vorhanden.",
"error": "Versionen konnten nicht geladen werden.",
"missingModelId": "Für dieses Modell ist keine Civitai-Model-ID vorhanden.",
@@ -1197,6 +1257,9 @@
"cannotSend": "Kann Rezept nicht senden: Fehlende Rezept-ID",
"sendFailed": "Fehler beim Senden des Rezepts an Workflow",
"sendError": "Fehler beim Senden des Rezepts an Workflow",
"missingCheckpointPath": "Checkpoint-Pfad nicht verfügbar",
"missingCheckpointInfo": "Checkpoint-Informationen fehlen",
"downloadCheckpointFailed": "Checkpoint-Download fehlgeschlagen: {message}",
"cannotDelete": "Kann Rezept nicht löschen: Fehlende Rezept-ID",
"deleteConfirmationError": "Fehler beim Anzeigen der Löschbestätigung",
"deletedSuccessfully": "Rezept erfolgreich gelöscht",
@@ -1303,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "Konnte trainierte Wörter nicht laden",
"tooLong": "Trigger Word sollte 30 Wörter nicht überschreiten",
"tooLong": "Trigger Word sollte 100 Wörter nicht überschreiten",
"tooMany": "Maximal 30 Trigger Words erlaubt",
"alreadyExists": "Dieses Trigger Word existiert bereits",
"updateSuccess": "Trigger Words erfolgreich aktualisiert",

View File

@@ -152,6 +152,13 @@
"none": "No example image folders needed cleanup",
"partial": "Cleanup completed with {failures} folder(s) skipped",
"error": "Failed to clean example image folders: {message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "Filter Models",
"baseModel": "Base Model",
"modelTags": "Tags (Top 20)",
"modelTypes": "Model Types",
"license": "License",
"noCreditRequired": "No Credit Required",
"allowSellingGeneratedContent": "Allow Selling",
"clearAll": "Clear All Filters"
},
"theme": {
@@ -220,10 +231,17 @@
"priorityTags": "Priority Tags",
"downloadPathTemplates": "Download Path Templates",
"exampleImages": "Example Images",
"updateFlags": "Update Flags",
"autoOrganize": "Auto-organize",
"misc": "Misc.",
"metadataArchive": "Metadata Archive Database",
"storageLocation": "Settings Location",
"proxySettings": "Proxy Settings"
},
"storage": {
"locationLabel": "Portable mode",
"locationHelp": "Enable to keep settings.json inside the repository; disable to store it in your user config directory."
},
"contentFiltering": {
"blurNsfwContent": "Blur NSFW Content",
"blurNsfwContentHelp": "Blur mature (NSFW) content preview images",
@@ -234,6 +252,15 @@
"autoplayOnHover": "Autoplay Videos on Hover",
"autoplayOnHoverHelp": "Only play video previews when hovering over them"
},
"autoOrganizeExclusions": {
"label": "Auto-organize exclusions",
"placeholder": "Example: curated/*, */backups/*; *_temp.safetensors",
"help": "Skip moving files that match these wildcard patterns. Separate multiple patterns with commas or semicolons.",
"validation": {
"noPatterns": "Enter at least one pattern separated by commas or semicolons.",
"saveFailed": "Unable to save exclusions: {message}"
}
},
"layoutSettings": {
"displayDensity": "Display Density",
"displayDensityOptions": {
@@ -349,6 +376,14 @@
"download": "Download",
"restartRequired": "Requires restart"
},
"updateFlagStrategy": {
"label": "Update Flag Strategy",
"help": "Decide whether update badges should only appear when a new release shares the same base model as your local files or whenever any newer version exists for that model.",
"options": {
"sameBase": "Match updates by base model",
"any": "Flag any available update"
}
},
"misc": {
"includeTriggerWords": "Include Trigger Words in LoRA Syntax",
"includeTriggerWordsHelp": "Include trained trigger words when copying LoRA syntax to clipboard"
@@ -471,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "Refresh Civitai Data",
"checkUpdates": "Check Updates",
"relinkCivitai": "Re-link to Civitai",
"copySyntax": "Copy LoRA Syntax",
"copyFilename": "Copy Model Filename",
@@ -492,6 +528,9 @@
},
"recipes": {
"title": "LoRA Recipes",
"actions": {
"sendCheckpoint": "Send to ComfyUI"
},
"controls": {
"import": {
"action": "Import",
@@ -875,6 +914,16 @@
"recipes": "Recipes",
"versions": "Versions"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "Creator credit required",
"noDerivatives": "No sharing merges",
"noReLicense": "Same permissions required",
"restrictionsLabel": "License restrictions"
},
"loading": {
"exampleImages": "Loading example images...",
"description": "Loading model description...",
@@ -908,6 +957,18 @@
"viewLocalVersions": "View all local versions",
"viewLocalTooltip": "Coming soon"
},
"filters": {
"label": "Base filter",
"state": {
"showAll": "All versions",
"showSameBase": "Same base"
},
"tooltip": {
"showAllVersions": "Switch to showing all versions",
"showSameBaseVersions": "Switch to showing only versions that match the current base model"
},
"empty": "No versions match the current base model filter."
},
"empty": "No version history available for this model yet.",
"error": "Failed to load versions.",
"missingModelId": "This model is missing a Civitai model id.",
@@ -1196,6 +1257,9 @@
"cannotSend": "Cannot send recipe: Missing recipe ID",
"sendFailed": "Failed to send recipe to workflow",
"sendError": "Error sending recipe to workflow",
"missingCheckpointPath": "Checkpoint path not available",
"missingCheckpointInfo": "Missing checkpoint information",
"downloadCheckpointFailed": "Failed to download checkpoint: {message}",
"cannotDelete": "Cannot delete recipe: Missing recipe ID",
"deleteConfirmationError": "Error showing delete confirmation",
"deletedSuccessfully": "Recipe deleted successfully",
@@ -1302,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "Could not load trained words",
"tooLong": "Trigger word should not exceed 30 words",
"tooLong": "Trigger word should not exceed 100 words",
"tooMany": "Maximum 30 trigger words allowed",
"alreadyExists": "This trigger word already exists",
"updateSuccess": "Trigger words updated successfully",

View File

@@ -152,6 +152,13 @@
"none": "No hay carpetas de imágenes de ejemplo que necesiten limpieza",
"partial": "Limpieza completada con {failures} carpeta(s) omitidas",
"error": "No se pudieron limpiar las carpetas de imágenes de ejemplo: {message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "Filtrar modelos",
"baseModel": "Modelo base",
"modelTags": "Etiquetas (Top 20)",
"modelTypes": "Model Types",
"license": "Licencia",
"noCreditRequired": "Sin crédito requerido",
"allowSellingGeneratedContent": "Venta permitida",
"clearAll": "Limpiar todos los filtros"
},
"theme": {
@@ -220,10 +231,17 @@
"priorityTags": "Etiquetas prioritarias",
"downloadPathTemplates": "Plantillas de rutas de descarga",
"exampleImages": "Imágenes de ejemplo",
"updateFlags": "Indicadores de actualización",
"autoOrganize": "Auto-organize",
"misc": "Varios",
"metadataArchive": "Base de datos de archivo de metadatos",
"storageLocation": "Ubicación de ajustes",
"proxySettings": "Configuración de proxy"
},
"storage": {
"locationLabel": "Modo portátil",
"locationHelp": "Activa para mantener settings.json dentro del repositorio; desactívalo para guardarlo en tu directorio de configuración de usuario."
},
"contentFiltering": {
"blurNsfwContent": "Difuminar contenido NSFW",
"blurNsfwContentHelp": "Difuminar imágenes de vista previa de contenido para adultos (NSFW)",
@@ -234,6 +252,15 @@
"autoplayOnHover": "Reproducir videos automáticamente al pasar el ratón",
"autoplayOnHoverHelp": "Solo reproducir vistas previas de video al pasar el ratón sobre ellas"
},
"autoOrganizeExclusions": {
"label": "Exclusiones de auto-organización",
"placeholder": "Ejemplo: curated/*, */backups/*; *_temp.safetensors",
"help": "Omitir archivos que coincidan con estos patrones comodín. Separe múltiples patrones con comas o puntos y comas.",
"validation": {
"noPatterns": "Ingrese al menos un patrón separado por comas o puntos y comas.",
"saveFailed": "No se pudieron guardar las exclusiones: {message}"
}
},
"layoutSettings": {
"displayDensity": "Densidad de visualización",
"displayDensityOptions": {
@@ -349,6 +376,14 @@
"download": "Descargar",
"restartRequired": "Requiere reinicio"
},
"updateFlagStrategy": {
"label": "Estrategia de indicadores de actualización",
"help": "Decide si las insignias de actualización deben mostrarse solo cuando una nueva versión comparte el mismo modelo base que tus archivos locales o siempre que exista cualquier versión más reciente de ese modelo.",
"options": {
"sameBase": "Coincidir actualizaciones por modelo base",
"any": "Marcar cualquier actualización disponible"
}
},
"misc": {
"includeTriggerWords": "Incluir palabras clave en la sintaxis de LoRA",
"includeTriggerWordsHelp": "Incluir palabras clave entrenadas al copiar la sintaxis de LoRA al portapapeles"
@@ -471,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "Actualizar datos de Civitai",
"checkUpdates": "Comprobar actualizaciones",
"relinkCivitai": "Re-vincular a Civitai",
"copySyntax": "Copiar sintaxis de LoRA",
"copyFilename": "Copiar nombre de archivo del modelo",
@@ -492,6 +528,9 @@
},
"recipes": {
"title": "Recetas de LoRA",
"actions": {
"sendCheckpoint": "Enviar a ComfyUI"
},
"controls": {
"import": {
"action": "Importar",
@@ -875,6 +914,16 @@
"recipes": "Recetas",
"versions": "Versiones"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "Crédito del creador requerido",
"noDerivatives": "No se permiten fusiones",
"noReLicense": "Se requieren mismos permisos",
"restrictionsLabel": "Restricciones de licencia"
},
"loading": {
"exampleImages": "Cargando imágenes de ejemplo...",
"description": "Cargando descripción del modelo...",
@@ -908,6 +957,18 @@
"viewLocalVersions": "Ver todas las versiones locales",
"viewLocalTooltip": "Disponible pronto"
},
"filters": {
"label": "Filtro base",
"state": {
"showAll": "Todas las versiones",
"showSameBase": "Mismo modelo base"
},
"tooltip": {
"showAllVersions": "Cambiar para mostrar todas las versiones",
"showSameBaseVersions": "Cambiar para mostrar solo versiones del mismo modelo base"
},
"empty": "Ninguna versión coincide con el filtro del modelo base actual."
},
"empty": "Aún no hay historial de versiones para este modelo.",
"error": "No se pudieron cargar las versiones.",
"missingModelId": "Este modelo no tiene un ID de modelo de Civitai.",
@@ -1196,6 +1257,9 @@
"cannotSend": "No se puede enviar receta: Falta ID de receta",
"sendFailed": "Error al enviar receta al flujo de trabajo",
"sendError": "Error enviando receta al flujo de trabajo",
"missingCheckpointPath": "Ruta del checkpoint no disponible",
"missingCheckpointInfo": "Falta información del checkpoint",
"downloadCheckpointFailed": "Error al descargar el checkpoint: {message}",
"cannotDelete": "No se puede eliminar receta: Falta ID de receta",
"deleteConfirmationError": "Error mostrando confirmación de eliminación",
"deletedSuccessfully": "Receta eliminada exitosamente",
@@ -1302,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "No se pudieron cargar palabras entrenadas",
"tooLong": "La palabra clave no debe exceder 30 palabras",
"tooLong": "La palabra clave no debe exceder 100 palabras",
"tooMany": "Máximo 30 palabras clave permitidas",
"alreadyExists": "Esta palabra clave ya existe",
"updateSuccess": "Palabras clave actualizadas exitosamente",

View File

@@ -152,6 +152,13 @@
"none": "Aucun dossier d'images d'exemple à nettoyer",
"partial": "Nettoyage terminé avec {failures} dossier(s) ignoré(s)",
"error": "Échec du nettoyage des dossiers d'images d'exemple : {message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "Filtrer les modèles",
"baseModel": "Modèle de base",
"modelTags": "Tags (Top 20)",
"modelTypes": "Model Types",
"license": "Licence",
"noCreditRequired": "Crédit non requis",
"allowSellingGeneratedContent": "Vente autorisée",
"clearAll": "Effacer tous les filtres"
},
"theme": {
@@ -217,12 +228,19 @@
"videoSettings": "Paramètres vidéo",
"layoutSettings": "Paramètres d'affichage",
"folderSettings": "Paramètres des dossiers",
"priorityTags": "Étiquettes prioritaires",
"downloadPathTemplates": "Modèles de chemin de téléchargement",
"exampleImages": "Images d'exemple",
"updateFlags": "Indicateurs de mise à jour",
"autoOrganize": "Auto-organize",
"misc": "Divers",
"metadataArchive": "Base de données d'archive des métadonnées",
"proxySettings": "Paramètres du proxy",
"priorityTags": "Étiquettes prioritaires"
"storageLocation": "Emplacement des paramètres",
"proxySettings": "Paramètres du proxy"
},
"storage": {
"locationLabel": "Mode portable",
"locationHelp": "Activez pour garder settings.json dans le dépôt ; désactivez pour le placer dans votre dossier de configuration utilisateur."
},
"contentFiltering": {
"blurNsfwContent": "Flouter le contenu NSFW",
@@ -234,6 +252,15 @@
"autoplayOnHover": "Lecture automatique vidéo au survol",
"autoplayOnHoverHelp": "Lire les aperçus vidéo uniquement lors du survol"
},
"autoOrganizeExclusions": {
"label": "Exclusions de l'auto-organisation",
"placeholder": "Exemple : curated/*, */backups/*; *_temp.safetensors",
"help": "Ignorer les fichiers correspondant à ces motifs génériques. Séparez plusieurs motifs par des virgules ou des points-virgules.",
"validation": {
"noPatterns": "Entrez au moins un motif séparé par des virgules ou des points-virgules.",
"saveFailed": "Impossible d'enregistrer les exclusions : {message}"
}
},
"layoutSettings": {
"displayDensity": "Densité d'affichage",
"displayDensityOptions": {
@@ -282,6 +309,26 @@
"defaultEmbeddingRootHelp": "Définir le répertoire racine embedding par défaut pour les téléchargements, imports et déplacements",
"noDefault": "Aucun par défaut"
},
"priorityTags": {
"title": "Étiquettes prioritaires",
"description": "Personnalisez l'ordre de priorité des étiquettes pour chaque type de modèle (par ex. : character, concept, style(toon|toon_style))",
"placeholder": "character, concept, style(toon|toon_style)",
"helpLinkLabel": "Ouvrir l'aide sur les étiquettes prioritaires",
"modelTypes": {
"lora": "LoRA",
"checkpoint": "Checkpoint",
"embedding": "Embedding"
},
"saveSuccess": "Étiquettes prioritaires mises à jour.",
"saveError": "Échec de la mise à jour des étiquettes prioritaires.",
"loadingSuggestions": "Chargement des suggestions...",
"validation": {
"missingClosingParen": "L'entrée {index} n'a pas de parenthèse fermante.",
"missingCanonical": "L'entrée {index} doit inclure un nom d'étiquette canonique.",
"duplicateCanonical": "L'étiquette canonique \"{tag}\" apparaît plusieurs fois.",
"unknown": "Configuration d'étiquettes prioritaires invalide."
}
},
"downloadPathTemplates": {
"title": "Modèles de chemin de téléchargement",
"help": "Configurer les structures de dossiers pour différents types de modèles lors du téléchargement depuis Civitai.",
@@ -329,6 +376,14 @@
"download": "Télécharger",
"restartRequired": "Redémarrage requis"
},
"updateFlagStrategy": {
"label": "Stratégie des indicateurs de mise à jour",
"help": "Choisissez si les badges de mise à jour doivent apparaître uniquement lorsquune nouvelle version partage le même modèle de base que vos fichiers locaux, ou dès quil existe une version plus récente pour ce modèle.",
"options": {
"sameBase": "Faire correspondre les mises à jour par modèle de base",
"any": "Signaler nimporte quelle mise à jour disponible"
}
},
"misc": {
"includeTriggerWords": "Inclure les mots-clés dans la syntaxe LoRA",
"includeTriggerWordsHelp": "Inclure les mots-clés d'entraînement lors de la copie de la syntaxe LoRA dans le presse-papiers"
@@ -374,26 +429,6 @@
"proxyPassword": "Mot de passe (optionnel)",
"proxyPasswordPlaceholder": "mot_de_passe",
"proxyPasswordHelp": "Mot de passe pour l'authentification proxy (si nécessaire)"
}
},
"loras": {
@@ -471,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "Actualiser les données Civitai",
"checkUpdates": "Vérifier les mises à jour",
"relinkCivitai": "Relier à nouveau à Civitai",
"copySyntax": "Copier la syntaxe LoRA",
"copyFilename": "Copier le nom de fichier du modèle",
@@ -492,6 +528,9 @@
},
"recipes": {
"title": "LoRA Recipes",
"actions": {
"sendCheckpoint": "Envoyer vers ComfyUI"
},
"controls": {
"import": {
"action": "Importer",
@@ -875,6 +914,16 @@
"recipes": "Recipes",
"versions": "Versions"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "Crédit du créateur requis",
"noDerivatives": "Pas de fusion de partage",
"noReLicense": "Mêmes autorisations requises",
"restrictionsLabel": "Restrictions de licence"
},
"loading": {
"exampleImages": "Chargement des images d'exemple...",
"description": "Chargement de la description du modèle...",
@@ -908,6 +957,18 @@
"viewLocalVersions": "Voir toutes les versions locales",
"viewLocalTooltip": "Bientôt disponible"
},
"filters": {
"label": "Filtre de base",
"state": {
"showAll": "Toutes les versions",
"showSameBase": "Même modèle de base"
},
"tooltip": {
"showAllVersions": "Passer à l'affichage de toutes les versions",
"showSameBaseVersions": "Passer à l'affichage des versions du même modèle de base"
},
"empty": "Aucune version ne correspond au filtre du modèle de base actuel."
},
"empty": "Aucun historique de versions n'est disponible pour ce modèle pour le moment.",
"error": "Échec du chargement des versions.",
"missingModelId": "Ce modèle ne possède pas d'identifiant de modèle Civitai.",
@@ -1196,6 +1257,9 @@
"cannotSend": "Impossible d'envoyer la recipe : ID de recipe manquant",
"sendFailed": "Échec de l'envoi de la recipe vers le workflow",
"sendError": "Erreur lors de l'envoi de la recipe vers le workflow",
"missingCheckpointPath": "Chemin du checkpoint indisponible",
"missingCheckpointInfo": "Informations sur le checkpoint manquantes",
"downloadCheckpointFailed": "Échec du téléchargement du checkpoint : {message}",
"cannotDelete": "Impossible de supprimer la recipe : ID de recipe manquant",
"deleteConfirmationError": "Erreur lors de l'affichage de la confirmation de suppression",
"deletedSuccessfully": "Recipe supprimée avec succès",
@@ -1302,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "Impossible de charger les mots entraînés",
"tooLong": "Le mot-clé ne doit pas dépasser 30 mots",
"tooLong": "Le mot-clé ne doit pas dépasser 100 mots",
"tooMany": "Maximum 30 mots-clés autorisés",
"alreadyExists": "Ce mot-clé existe déjà",
"updateSuccess": "Mots-clés mis à jour avec succès",

View File

@@ -152,6 +152,13 @@
"none": "אין תיקיות תמונות דוגמה שזקוקות לניקוי",
"partial": "הניקוי הושלם עם דילוג על {failures} תיקיות",
"error": "ניקוי תיקיות תמונות הדוגמה נכשל: {message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "סנן מודלים",
"baseModel": "מודל בסיס",
"modelTags": "תגיות (20 המובילות)",
"modelTypes": "Model Types",
"license": "רישיון",
"noCreditRequired": "ללא קרדיט נדרש",
"allowSellingGeneratedContent": "אפשר מכירה",
"clearAll": "נקה את כל המסננים"
},
"theme": {
@@ -219,11 +230,18 @@
"folderSettings": "הגדרות תיקייה",
"downloadPathTemplates": "תבניות נתיב הורדה",
"exampleImages": "תמונות דוגמה",
"updateFlags": "תגי עדכון",
"autoOrganize": "Auto-organize",
"misc": "שונות",
"metadataArchive": "מסד נתונים של ארכיון מטא-דאטה",
"storageLocation": "מיקום ההגדרות",
"proxySettings": "הגדרות פרוקסי",
"priorityTags": "תגיות עדיפות"
},
"storage": {
"locationLabel": "מצב נייד",
"locationHelp": "הפעל כדי לשמור את settings.json בתוך המאגר; בטל כדי לשמור אותו בתיקיית ההגדרות של המשתמש."
},
"contentFiltering": {
"blurNsfwContent": "טשטש תוכן NSFW",
"blurNsfwContentHelp": "טשטש תמונות תצוגה מקדימה של תוכן למבוגרים (NSFW)",
@@ -234,6 +252,15 @@
"autoplayOnHover": "נגן וידאו אוטומטית בריחוף",
"autoplayOnHoverHelp": "נגן תצוגות מקדימות של וידאו רק בעת ריחוף מעליהן"
},
"autoOrganizeExclusions": {
"label": "יוצא דופן של ארגון אוטומטי",
"placeholder": "דוגמה: curated/*, */backups/*; *_temp.safetensors",
"help": "דלג על העברת קבצים התואמים לתבניות אלו. הפרד תבניות מרובות בפסיקים או בנקודותיים.",
"validation": {
"noPatterns": "הזן לפחות תבנית אחת מופרדת בפסיקים או בנקודותיים.",
"saveFailed": "לא ניתן לשמור את ההוצאות: {message}"
}
},
"layoutSettings": {
"displayDensity": "צפיפות תצוגה",
"displayDensityOptions": {
@@ -329,6 +356,14 @@
"download": "הורד",
"restartRequired": "דורש הפעלה מחדש"
},
"updateFlagStrategy": {
"label": "אסטרטגיית תגי עדכון",
"help": "בחרו אם תוויות העדכון יוצגו רק כאשר גרסה חדשה חולקת את אותו דגם בסיס כמו הקבצים המקומיים שלכם או בכל מקרה שבו קיימת גרסה חדשה עבור אותו דגם.",
"options": {
"sameBase": "התאמת עדכונים לפי דגם בסיס",
"any": "תוויות לכל עדכון זמין"
}
},
"misc": {
"includeTriggerWords": "כלול מילות טריגר בתחביר LoRA",
"includeTriggerWordsHelp": "כלול מילות טריגר מאומנות בעת העתקת תחביר LoRA ללוח"
@@ -471,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "רענן נתוני Civitai",
"checkUpdates": "בדוק עדכונים",
"relinkCivitai": "קשר מחדש ל-Civitai",
"copySyntax": "העתק תחביר LoRA",
"copyFilename": "העתק שם קובץ מודל",
@@ -492,6 +528,9 @@
},
"recipes": {
"title": "מתכוני LoRA",
"actions": {
"sendCheckpoint": "שלח ל-ComfyUI"
},
"controls": {
"import": {
"action": "ייבא",
@@ -875,6 +914,16 @@
"recipes": "מתכונים",
"versions": "גרסאות"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "נדרש ייחוס ליוצר",
"noDerivatives": "אין שיתוף מיזוגים",
"noReLicense": "נדרשות אותן הרשאות",
"restrictionsLabel": "הגבלות רישיון"
},
"loading": {
"exampleImages": "טוען תמונות דוגמה...",
"description": "טוען תיאור מודל...",
@@ -908,6 +957,18 @@
"viewLocalVersions": "הצג את כל הגרסאות המקומיות",
"viewLocalTooltip": "יגיע בקרוב"
},
"filters": {
"label": "מסנן בסיס",
"state": {
"showAll": "כל הגרסאות",
"showSameBase": "אותו מודל בסיס"
},
"tooltip": {
"showAllVersions": "החלף להצגת כל הגרסאות",
"showSameBaseVersions": "החלף להצגת גרסאות עם אותו מודל בסיס"
},
"empty": "אין גרסאות התואמות את המסנן של מודל הבסיס הנוכחי."
},
"empty": "אין עדיין היסטוריית גרסאות למודל זה.",
"error": "טעינת הגרסאות נכשלה.",
"missingModelId": "למודל זה אין מזהה מודל של Civitai.",
@@ -1196,6 +1257,9 @@
"cannotSend": "לא ניתן לשלוח מתכון: חסר מזהה מתכון",
"sendFailed": "שליחת המתכון ל-workflow נכשלה",
"sendError": "שגיאה בשליחת המתכון ל-workflow",
"missingCheckpointPath": "נתיב ה-checkpoint אינו זמין",
"missingCheckpointInfo": "חסרים פרטי checkpoint",
"downloadCheckpointFailed": "הורדת checkpoint נכשלה: {message}",
"cannotDelete": "לא ניתן למחוק מתכון: חסר מזהה מתכון",
"deleteConfirmationError": "שגיאה בהצגת אישור המחיקה",
"deletedSuccessfully": "המתכון נמחק בהצלחה",
@@ -1302,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "לא ניתן היה לטעון מילים מאומנות",
"tooLong": "מילת טריגר לא תעלה על 30 מילים",
"tooLong": "מילת טריגר לא תעלה על 100 מילים",
"tooMany": "מותרות עד 30 מילות טריגר",
"alreadyExists": "מילת טריגר זו כבר קיימת",
"updateSuccess": "מילות הטריגר עודכנו בהצלחה",

View File

@@ -152,6 +152,13 @@
"none": "クリーンアップが必要な例画像フォルダはありません",
"partial": "クリーンアップが完了しましたが、{failures} 個のフォルダはスキップされました",
"error": "例画像フォルダのクリーンアップに失敗しました:{message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "モデルをフィルタ",
"baseModel": "ベースモデル",
"modelTags": "タグ上位20",
"modelTypes": "Model Types",
"license": "ライセンス",
"noCreditRequired": "クレジット不要",
"allowSellingGeneratedContent": "販売許可",
"clearAll": "すべてのフィルタをクリア"
},
"theme": {
@@ -217,12 +228,19 @@
"videoSettings": "動画設定",
"layoutSettings": "レイアウト設定",
"folderSettings": "フォルダ設定",
"priorityTags": "優先タグ",
"downloadPathTemplates": "ダウンロードパステンプレート",
"exampleImages": "例画像",
"updateFlags": "アップデートフラグ",
"autoOrganize": "Auto-organize",
"misc": "その他",
"metadataArchive": "メタデータアーカイブデータベース",
"proxySettings": "プロキシ設定",
"priorityTags": "優先タグ"
"storageLocation": "設定の場所",
"proxySettings": "プロキシ設定"
},
"storage": {
"locationLabel": "ポータブルモード",
"locationHelp": "有効にすると settings.json をリポジトリ内に保持し、無効にするとユーザー設定ディレクトリに格納します。"
},
"contentFiltering": {
"blurNsfwContent": "NSFWコンテンツをぼかす",
@@ -234,6 +252,15 @@
"autoplayOnHover": "ホバー時に動画を自動再生",
"autoplayOnHoverHelp": "動画プレビューはホバー時にのみ再生されます"
},
"autoOrganizeExclusions": {
"label": "自動整理除外設定",
"placeholder": "例: curated/*, */backups/*; *_temp.safetensors",
"help": "これらのワイルドカードパターンに一致するファイルの移動をスキップします。複数のパターンはカンマまたはセミコロンで区切ってください。",
"validation": {
"noPatterns": "カンマまたはセミコロンで区切られた少なくとも1つのパターンを入力してください。",
"saveFailed": "除外設定を保存できませんでした: {message}"
}
},
"layoutSettings": {
"displayDensity": "表示密度",
"displayDensityOptions": {
@@ -282,6 +309,26 @@
"defaultEmbeddingRootHelp": "ダウンロード、インポート、移動用のデフォルトembeddingルートディレクトリを設定",
"noDefault": "デフォルトなし"
},
"priorityTags": {
"title": "優先タグ",
"description": "各モデルタイプのタグ優先順位をカスタマイズします (例: character, concept, style(toon|toon_style))",
"placeholder": "character, concept, style(toon|toon_style)",
"helpLinkLabel": "優先タグのヘルプを開く",
"modelTypes": {
"lora": "LoRA",
"checkpoint": "チェックポイント",
"embedding": "埋め込み"
},
"saveSuccess": "優先タグを更新しました。",
"saveError": "優先タグの更新に失敗しました。",
"loadingSuggestions": "候補を読み込み中...",
"validation": {
"missingClosingParen": "エントリ {index} に閉じ括弧がありません。",
"missingCanonical": "エントリ {index} には正規タグ名を含める必要があります。",
"duplicateCanonical": "正規タグ \"{tag}\" が複数回登場しています。",
"unknown": "無効な優先タグ設定です。"
}
},
"downloadPathTemplates": {
"title": "ダウンロードパステンプレート",
"help": "Civitaiからダウンロードする際の異なるモデルタイプのフォルダ構造を設定します。",
@@ -329,6 +376,14 @@
"download": "ダウンロード",
"restartRequired": "再起動が必要"
},
"updateFlagStrategy": {
"label": "アップデートフラグの表示戦略",
"help": "新リリースがローカルファイルと同じベースモデルを共有する場合にのみ更新バッジを表示するか、そのモデルに新しいバージョンがあれば常に表示するかを決めます。",
"options": {
"sameBase": "ベースモデルで更新をマッチ",
"any": "利用可能な更新すべてを表示"
}
},
"misc": {
"includeTriggerWords": "LoRA構文にトリガーワードを含める",
"includeTriggerWordsHelp": "LoRA構文をクリップボードにコピーする際、学習済みトリガーワードを含めます"
@@ -374,26 +429,6 @@
"proxyPassword": "パスワード(任意)",
"proxyPasswordPlaceholder": "パスワード",
"proxyPasswordHelp": "プロキシ認証用のパスワード(必要な場合)"
}
},
"loras": {
@@ -471,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "Civitaiデータを更新",
"checkUpdates": "更新確認",
"relinkCivitai": "Civitaiに再リンク",
"copySyntax": "LoRA構文をコピー",
"copyFilename": "モデルファイル名をコピー",
@@ -492,6 +528,9 @@
},
"recipes": {
"title": "LoRAレシピ",
"actions": {
"sendCheckpoint": "ComfyUIへ送信"
},
"controls": {
"import": {
"action": "インポート",
@@ -875,6 +914,16 @@
"recipes": "レシピ",
"versions": "バージョン"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "作成者のクレジットが必要",
"noDerivatives": "共有マージ不可",
"noReLicense": "同じ権限が必要",
"restrictionsLabel": "ライセンス制限"
},
"loading": {
"exampleImages": "例画像を読み込み中...",
"description": "モデル説明を読み込み中...",
@@ -908,6 +957,18 @@
"viewLocalVersions": "ローカルの全バージョンを表示",
"viewLocalTooltip": "近日対応予定"
},
"filters": {
"label": "ベースフィルター",
"state": {
"showAll": "すべてのバージョン",
"showSameBase": "同じベース"
},
"tooltip": {
"showAllVersions": "すべてのバージョンを表示する",
"showSameBaseVersions": "同じベースモデルのバージョンのみ表示する"
},
"empty": "現在のベースモデルフィルターに一致するバージョンがありません。"
},
"empty": "このモデルにはまだバージョン履歴がありません。",
"error": "バージョンの読み込みに失敗しました。",
"missingModelId": "このモデルにはCivitaiのモデルIDがありません。",
@@ -1196,6 +1257,9 @@
"cannotSend": "レシピを送信できませんレシピIDがありません",
"sendFailed": "レシピのワークフローへの送信に失敗しました",
"sendError": "レシピのワークフロー送信エラー",
"missingCheckpointPath": "チェックポイントのパスがありません",
"missingCheckpointInfo": "チェックポイント情報が不足しています",
"downloadCheckpointFailed": "チェックポイントのダウンロードに失敗しました: {message}",
"cannotDelete": "レシピを削除できませんレシピIDがありません",
"deleteConfirmationError": "削除確認の表示中にエラーが発生しました",
"deletedSuccessfully": "レシピが正常に削除されました",
@@ -1302,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "学習済みワードを読み込めませんでした",
"tooLong": "トリガーワードは30ワードを超えてはいけません",
"tooLong": "トリガーワードは100ワードを超えてはいけません",
"tooMany": "最大30トリガーワードまで許可されています",
"alreadyExists": "このトリガーワードは既に存在します",
"updateSuccess": "トリガーワードが正常に更新されました",

View File

@@ -152,6 +152,13 @@
"none": "정리가 필요한 예시 이미지 폴더가 없습니다",
"partial": "정리가 완료되었으나 {failures}개의 폴더가 건너뛰어졌습니다",
"error": "예시 이미지 폴더 정리에 실패했습니다: {message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "모델 필터",
"baseModel": "베이스 모델",
"modelTags": "태그 (상위 20개)",
"modelTypes": "Model Types",
"license": "라이선스",
"noCreditRequired": "크레딧 표기 없음",
"allowSellingGeneratedContent": "판매 허용",
"clearAll": "모든 필터 지우기"
},
"theme": {
@@ -217,12 +228,19 @@
"videoSettings": "비디오 설정",
"layoutSettings": "레이아웃 설정",
"folderSettings": "폴더 설정",
"priorityTags": "우선순위 태그",
"downloadPathTemplates": "다운로드 경로 템플릿",
"exampleImages": "예시 이미지",
"updateFlags": "업데이트 표시",
"autoOrganize": "Auto-organize",
"misc": "기타",
"metadataArchive": "메타데이터 아카이브 데이터베이스",
"proxySettings": "프록시 설정",
"priorityTags": "우선순위 태그"
"storageLocation": "설정 위치",
"proxySettings": "프록시 설정"
},
"storage": {
"locationLabel": "휴대용 모드",
"locationHelp": "활성화하면 settings.json을 리포지토리에 유지하고, 비활성화하면 사용자 구성 디렉터리에 저장합니다."
},
"contentFiltering": {
"blurNsfwContent": "NSFW 콘텐츠 블러 처리",
@@ -234,6 +252,15 @@
"autoplayOnHover": "호버 시 비디오 자동 재생",
"autoplayOnHoverHelp": "마우스를 올렸을 때만 비디오 미리보기를 재생합니다"
},
"autoOrganizeExclusions": {
"label": "자동 정리 제외 항목",
"placeholder": "예: curated/*, */backups/*; *_temp.safetensors",
"help": "이 와일드카드 패턴과 일치하는 파일 이동을 건너뜁니다. 여러 패턴은 쉼표 또는 세미콜론으로 구분하십시오.",
"validation": {
"noPatterns": "쉼표 또는 세미콜론으로 구분된 최소한 하나의 패턴을 입력하십시오.",
"saveFailed": "제외 항목을 저장할 수 없습니다: {message}"
}
},
"layoutSettings": {
"displayDensity": "표시 밀도",
"displayDensityOptions": {
@@ -282,6 +309,26 @@
"defaultEmbeddingRootHelp": "다운로드, 가져오기 및 이동을 위한 기본 Embedding 루트 디렉토리를 설정합니다",
"noDefault": "기본값 없음"
},
"priorityTags": {
"title": "우선순위 태그",
"description": "모델 유형별 태그 우선순위를 사용자 지정합니다(예: character, concept, style(toon|toon_style)).",
"placeholder": "character, concept, style(toon|toon_style)",
"helpLinkLabel": "우선순위 태그 도움말 열기",
"modelTypes": {
"lora": "LoRA",
"checkpoint": "체크포인트",
"embedding": "임베딩"
},
"saveSuccess": "우선순위 태그가 업데이트되었습니다.",
"saveError": "우선순위 태그를 업데이트하지 못했습니다.",
"loadingSuggestions": "추천을 불러오는 중...",
"validation": {
"missingClosingParen": "{index}번째 항목에 닫는 괄호가 없습니다.",
"missingCanonical": "{index}번째 항목에는 정식 태그 이름이 포함되어야 합니다.",
"duplicateCanonical": "정식 태그 \"{tag}\"가 여러 번 나타납니다.",
"unknown": "잘못된 우선순위 태그 구성입니다."
}
},
"downloadPathTemplates": {
"title": "다운로드 경로 템플릿",
"help": "Civitai에서 다운로드할 때 다양한 모델 유형의 폴더 구조를 구성합니다.",
@@ -329,6 +376,14 @@
"download": "다운로드",
"restartRequired": "재시작 필요"
},
"updateFlagStrategy": {
"label": "업데이트 표시 전략",
"help": "새 릴리스가 로컬 파일과 동일한 베이스 모델을 공유할 때만 업데이트 배지를 표시할지, 또는 해당 모델에 사용 가능한 새 버전이 있으면 항상 표시할지 결정합니다.",
"options": {
"sameBase": "베이스 모델로 업데이트 일치",
"any": "사용 가능한 모든 업데이트 표시"
}
},
"misc": {
"includeTriggerWords": "LoRA 문법에 트리거 단어 포함",
"includeTriggerWordsHelp": "LoRA 문법을 클립보드에 복사할 때 학습된 트리거 단어를 포함합니다"
@@ -374,26 +429,6 @@
"proxyPassword": "비밀번호 (선택사항)",
"proxyPasswordPlaceholder": "password",
"proxyPasswordHelp": "프록시 인증에 필요한 비밀번호 (필요한 경우)"
}
},
"loras": {
@@ -471,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "Civitai 데이터 새로고침",
"checkUpdates": "업데이트 확인",
"relinkCivitai": "Civitai에 다시 연결",
"copySyntax": "LoRA 문법 복사",
"copyFilename": "모델 파일명 복사",
@@ -492,6 +528,9 @@
},
"recipes": {
"title": "LoRA 레시피",
"actions": {
"sendCheckpoint": "ComfyUI로 보내기"
},
"controls": {
"import": {
"action": "가져오기",
@@ -875,6 +914,16 @@
"recipes": "레시피",
"versions": "버전"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "제작자 크레딧 필요",
"noDerivatives": "공유 병합 불가",
"noReLicense": "동일한 권한 필요",
"restrictionsLabel": "라이선스 제한"
},
"loading": {
"exampleImages": "예시 이미지 로딩 중...",
"description": "모델 설명 로딩 중...",
@@ -908,6 +957,18 @@
"viewLocalVersions": "로컬 버전 모두 보기",
"viewLocalTooltip": "곧 제공 예정"
},
"filters": {
"label": "기본 필터",
"state": {
"showAll": "모든 버전",
"showSameBase": "같은 베이스"
},
"tooltip": {
"showAllVersions": "모든 버전을 표시하도록 전환",
"showSameBaseVersions": "같은 베이스 모델 버전만 표시하도록 전환"
},
"empty": "현재 베이스 모델 필터와 일치하는 버전이 없습니다."
},
"empty": "이 모델에는 아직 버전 기록이 없습니다.",
"error": "버전을 불러오지 못했습니다.",
"missingModelId": "이 모델에는 Civitai 모델 ID가 없습니다.",
@@ -1196,6 +1257,9 @@
"cannotSend": "레시피를 전송할 수 없습니다: 레시피 ID 누락",
"sendFailed": "레시피를 워크플로로 전송하는데 실패했습니다",
"sendError": "레시피를 워크플로로 전송하는 중 오류",
"missingCheckpointPath": "체크포인트 경로를 사용할 수 없습니다",
"missingCheckpointInfo": "체크포인트 정보가 부족합니다",
"downloadCheckpointFailed": "체크포인트 다운로드 실패: {message}",
"cannotDelete": "레시피를 삭제할 수 없습니다: 레시피 ID 누락",
"deleteConfirmationError": "삭제 확인 표시 오류",
"deletedSuccessfully": "레시피가 성공적으로 삭제되었습니다",
@@ -1302,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "학습된 단어를 로딩할 수 없습니다",
"tooLong": "트리거 단어는 30단어를 초과할 수 없습니다",
"tooLong": "트리거 단어는 100단어를 초과할 수 없습니다",
"tooMany": "최대 30개의 트리거 단어만 허용됩니다",
"alreadyExists": "이 트리거 단어는 이미 존재합니다",
"updateSuccess": "트리거 단어가 성공적으로 업데이트되었습니다",

View File

@@ -152,6 +152,13 @@
"none": "Нет папок с примерами изображений, требующих очистки",
"partial": "Очистка завершена, пропущено {failures} папок",
"error": "Не удалось очистить папки с примерами изображений: {message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "Фильтр моделей",
"baseModel": "Базовая модель",
"modelTags": "Теги (Топ 20)",
"modelTypes": "Model Types",
"license": "Лицензия",
"noCreditRequired": "Без указания авторства",
"allowSellingGeneratedContent": "Продажа разрешена",
"clearAll": "Очистить все фильтры"
},
"theme": {
@@ -217,12 +228,19 @@
"videoSettings": "Настройки видео",
"layoutSettings": "Настройки макета",
"folderSettings": "Настройки папок",
"priorityTags": "Приоритетные теги",
"downloadPathTemplates": "Шаблоны путей загрузки",
"exampleImages": "Примеры изображений",
"updateFlags": "Метки обновлений",
"autoOrganize": "Auto-organize",
"misc": "Разное",
"metadataArchive": "Архив метаданных",
"proxySettings": "Настройки прокси",
"priorityTags": "Приоритетные теги"
"storageLocation": "Расположение настроек",
"proxySettings": "Настройки прокси"
},
"storage": {
"locationLabel": "Портативный режим",
"locationHelp": "Включите, чтобы хранить settings.json в репозитории; выключите, чтобы сохранить его в папке конфигурации пользователя."
},
"contentFiltering": {
"blurNsfwContent": "Размывать NSFW контент",
@@ -234,6 +252,15 @@
"autoplayOnHover": "Автовоспроизведение видео при наведении",
"autoplayOnHoverHelp": "Воспроизводить превью видео только при наведении курсора"
},
"autoOrganizeExclusions": {
"label": "Исключения автосортировки",
"placeholder": "Пример: curated/*, */backups/*; *_temp.safetensors",
"help": "Пропускать перемещение файлов, соответствующих этим шаблонам. Разделяйте несколько шаблонов запятыми или точками с запятой.",
"validation": {
"noPatterns": "Введите хотя бы один шаблон, разделенный запятыми или точками с запятой.",
"saveFailed": "Не удалось сохранить исключения: {message}"
}
},
"layoutSettings": {
"displayDensity": "Плотность отображения",
"displayDensityOptions": {
@@ -282,6 +309,26 @@
"defaultEmbeddingRootHelp": "Установить корневую папку embedding по умолчанию для загрузок, импорта и перемещений",
"noDefault": "Не задано"
},
"priorityTags": {
"title": "Приоритетные теги",
"description": "Настройте порядок приоритетов тегов для каждого типа моделей (например, character, concept, style(toon|toon_style)).",
"placeholder": "character, concept, style(toon|toon_style)",
"helpLinkLabel": "Открыть справку по приоритетным тегам",
"modelTypes": {
"lora": "LoRA",
"checkpoint": "Чекпойнт",
"embedding": "Эмбеддинг"
},
"saveSuccess": "Приоритетные теги обновлены.",
"saveError": "Не удалось обновить приоритетные теги.",
"loadingSuggestions": "Загрузка подсказок...",
"validation": {
"missingClosingParen": "В записи {index} отсутствует закрывающая скобка.",
"missingCanonical": "Запись {index} должна содержать каноническое имя тега.",
"duplicateCanonical": "Канонический тег \"{tag}\" встречается более одного раза.",
"unknown": "Недопустимая конфигурация приоритетных тегов."
}
},
"downloadPathTemplates": {
"title": "Шаблоны путей загрузки",
"help": "Настройте структуру папок для разных типов моделей при загрузке с Civitai.",
@@ -329,6 +376,14 @@
"download": "Загрузить",
"restartRequired": "Требует перезапуска"
},
"updateFlagStrategy": {
"label": "Стратегия меток обновлений",
"help": "Выберите, отображать ли значки обновления только когда новая версия имеет тот же базовый модель, что и локальные файлы, или всегда при наличии любого нового релиза для этой модели.",
"options": {
"sameBase": "Совпадение обновлений по базовой модели",
"any": "Отмечать любые доступные обновления"
}
},
"misc": {
"includeTriggerWords": "Включать триггерные слова в синтаксис LoRA",
"includeTriggerWordsHelp": "Включать обученные триггерные слова при копировании синтаксиса LoRA в буфер обмена"
@@ -374,26 +429,6 @@
"proxyPassword": "Пароль (необязательно)",
"proxyPasswordPlaceholder": "пароль",
"proxyPasswordHelp": "Пароль для аутентификации на прокси (если требуется)"
}
},
"loras": {
@@ -471,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "Обновить данные Civitai",
"checkUpdates": "Проверить обновления",
"relinkCivitai": "Пересвязать с Civitai",
"copySyntax": "Копировать синтаксис LoRA",
"copyFilename": "Копировать имя файла модели",
@@ -492,6 +528,9 @@
},
"recipes": {
"title": "Рецепты LoRA",
"actions": {
"sendCheckpoint": "Отправить в ComfyUI"
},
"controls": {
"import": {
"action": "Импортировать",
@@ -875,6 +914,16 @@
"recipes": "Рецепты",
"versions": "Версии"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "Требуется указание авторства",
"noDerivatives": "Запрет на совместное использование производных работ",
"noReLicense": "Требуются те же права",
"restrictionsLabel": "Лицензионные ограничения"
},
"loading": {
"exampleImages": "Загрузка примеров изображений...",
"description": "Загрузка описания модели...",
@@ -908,6 +957,18 @@
"viewLocalVersions": "Показать все локальные версии",
"viewLocalTooltip": "Скоро появится"
},
"filters": {
"label": "Фильтр по базе",
"state": {
"showAll": "Все версии",
"showSameBase": "Тот же базовый"
},
"tooltip": {
"showAllVersions": "Переключиться на отображение всех версий",
"showSameBaseVersions": "Переключиться на отображение только версий с тем же базовым"
},
"empty": "Нет версий, соответствующих текущему фильтру базовой модели."
},
"empty": "Для этой модели пока нет истории версий.",
"error": "Не удалось загрузить версии.",
"missingModelId": "У этой модели отсутствует идентификатор модели Civitai.",
@@ -1196,6 +1257,9 @@
"cannotSend": "Невозможно отправить рецепт: отсутствует ID рецепта",
"sendFailed": "Не удалось отправить рецепт в workflow",
"sendError": "Ошибка отправки рецепта в workflow",
"missingCheckpointPath": "Путь к чекпойнту недоступен",
"missingCheckpointInfo": "Отсутствуют данные о чекпойнте",
"downloadCheckpointFailed": "Не удалось скачать чекпойнт: {message}",
"cannotDelete": "Невозможно удалить рецепт: отсутствует ID рецепта",
"deleteConfirmationError": "Ошибка отображения подтверждения удаления",
"deletedSuccessfully": "Рецепт успешно удален",
@@ -1302,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "Не удалось загрузить обученные слова",
"tooLong": "Триггерное слово не должно превышать 30 слов",
"tooLong": "Триггерное слово не должно превышать 100 слов",
"tooMany": "Максимум 30 триггерных слов разрешено",
"alreadyExists": "Это триггерное слово уже существует",
"updateSuccess": "Триггерные слова успешно обновлены",

View File

@@ -152,6 +152,13 @@
"none": "没有需要清理的示例图片文件夹",
"partial": "清理完成,有 {failures} 个文件夹跳过",
"error": "清理示例图片文件夹失败:{message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "筛选模型",
"baseModel": "基础模型",
"modelTags": "标签前20",
"modelTypes": "Model Types",
"license": "许可证",
"noCreditRequired": "无需署名",
"allowSellingGeneratedContent": "允许销售",
"clearAll": "清除所有筛选"
},
"theme": {
@@ -217,12 +228,19 @@
"videoSettings": "视频设置",
"layoutSettings": "布局设置",
"folderSettings": "文件夹设置",
"priorityTags": "优先标签",
"downloadPathTemplates": "下载路径模板",
"exampleImages": "示例图片",
"updateFlags": "更新标记",
"autoOrganize": "Auto-organize",
"misc": "其他",
"metadataArchive": "元数据归档数据库",
"proxySettings": "代理设置",
"priorityTags": "优先标签"
"storageLocation": "设置位置",
"proxySettings": "代理设置"
},
"storage": {
"locationLabel": "便携模式",
"locationHelp": "开启可将 settings.json 保存在仓库中;关闭则保存在用户配置目录。"
},
"contentFiltering": {
"blurNsfwContent": "模糊 NSFW 内容",
@@ -234,6 +252,15 @@
"autoplayOnHover": "悬停时自动播放视频",
"autoplayOnHoverHelp": "仅在悬停时播放视频预览"
},
"autoOrganizeExclusions": {
"label": "自动整理排除项",
"placeholder": "示例: curated/*, */backups/*; *_temp.safetensors",
"help": "跳过与这些通配符模式匹配的文件。多个模式用逗号或分号分隔。",
"validation": {
"noPatterns": "请输入至少一个用逗号或分号分隔的模式。",
"saveFailed": "无法保存排除项:{message}"
}
},
"layoutSettings": {
"displayDensity": "显示密度",
"displayDensityOptions": {
@@ -282,6 +309,26 @@
"defaultEmbeddingRootHelp": "设置下载、导入和移动时的默认 Embedding 根目录",
"noDefault": "无默认"
},
"priorityTags": {
"title": "优先标签",
"description": "为每种模型类型自定义标签优先级顺序 (例如: character, concept, style(toon|toon_style))",
"placeholder": "character, concept, style(toon|toon_style)",
"helpLinkLabel": "打开优先标签帮助",
"modelTypes": {
"lora": "LoRA",
"checkpoint": "Checkpoint",
"embedding": "Embedding"
},
"saveSuccess": "优先标签已更新。",
"saveError": "优先标签更新失败。",
"loadingSuggestions": "正在加载建议...",
"validation": {
"missingClosingParen": "条目 {index} 缺少右括号。",
"missingCanonical": "条目 {index} 必须包含规范标签名称。",
"duplicateCanonical": "规范标签 \"{tag}\" 出现多次。",
"unknown": "优先标签配置无效。"
}
},
"downloadPathTemplates": {
"title": "下载路径模板",
"help": "配置从 Civitai 下载不同模型类型的文件夹结构。",
@@ -329,6 +376,14 @@
"download": "下载",
"restartRequired": "需要重启"
},
"updateFlagStrategy": {
"label": "更新标记策略",
"help": "决定更新徽章是否仅在新版本与本地文件共享相同基础模型时显示,或只要该模型有任何更新版本就显示。",
"options": {
"sameBase": "按基础模型匹配更新",
"any": "显示任何可用更新"
}
},
"misc": {
"includeTriggerWords": "复制 LoRA 语法时包含触发词",
"includeTriggerWordsHelp": "复制 LoRA 语法到剪贴板时包含训练触发词"
@@ -374,26 +429,6 @@
"proxyPassword": "密码 (可选)",
"proxyPasswordPlaceholder": "密码",
"proxyPasswordHelp": "代理认证的密码 (如果需要)"
},
"priorityTags": {
"title": "优先标签",
"description": "为每种模型类型自定义标签优先级顺序 (例如: character, concept, style(toon|toon_style))",
"placeholder": "character, concept, style(toon|toon_style)",
"helpLinkLabel": "打开优先标签帮助",
"modelTypes": {
"lora": "LoRA",
"checkpoint": "Checkpoint",
"embedding": "Embedding"
},
"saveSuccess": "优先标签已更新。",
"saveError": "优先标签更新失败。",
"loadingSuggestions": "正在加载建议...",
"validation": {
"missingClosingParen": "条目 {index} 缺少右括号。",
"missingCanonical": "条目 {index} 必须包含规范标签名称。",
"duplicateCanonical": "规范标签 \"{tag}\" 出现多次。",
"unknown": "优先标签配置无效。"
}
}
},
"loras": {
@@ -471,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "刷新 Civitai 数据",
"checkUpdates": "检查更新",
"relinkCivitai": "重新关联到 Civitai",
"copySyntax": "复制 LoRA 语法",
"copyFilename": "复制模型文件名",
@@ -492,6 +528,9 @@
},
"recipes": {
"title": "LoRA 配方",
"actions": {
"sendCheckpoint": "发送到 ComfyUI"
},
"controls": {
"import": {
"action": "导入",
@@ -875,6 +914,16 @@
"recipes": "配方",
"versions": "版本"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "需要创作者署名",
"noDerivatives": "禁止分享合并作品",
"noReLicense": "需要相同权限",
"restrictionsLabel": "许可证限制"
},
"loading": {
"exampleImages": "正在加载示例图片...",
"description": "正在加载模型描述...",
@@ -908,6 +957,18 @@
"viewLocalVersions": "查看所有本地版本",
"viewLocalTooltip": "敬请期待"
},
"filters": {
"label": "基础筛选",
"state": {
"showAll": "全部版本",
"showSameBase": "相同基模型"
},
"tooltip": {
"showAllVersions": "切换为显示所有版本",
"showSameBaseVersions": "仅显示与当前基模型匹配的版本"
},
"empty": "没有与当前基模型筛选匹配的版本。"
},
"empty": "该模型还没有版本历史。",
"error": "加载版本失败。",
"missingModelId": "该模型缺少 Civitai 模型 ID。",
@@ -1196,6 +1257,9 @@
"cannotSend": "无法发送配方:缺少配方 ID",
"sendFailed": "发送配方到工作流失败",
"sendError": "发送配方到工作流出错",
"missingCheckpointPath": "缺少检查点路径",
"missingCheckpointInfo": "缺少检查点信息",
"downloadCheckpointFailed": "下载检查点失败:{message}",
"cannotDelete": "无法删除配方:缺少配方 ID",
"deleteConfirmationError": "显示删除确认出错",
"deletedSuccessfully": "配方删除成功",
@@ -1302,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "无法加载训练词",
"tooLong": "触发词不能超过30个词",
"tooLong": "触发词不能超过100个词",
"tooMany": "最多允许30个触发词",
"alreadyExists": "该触发词已存在",
"updateSuccess": "触发词更新成功",

View File

@@ -152,6 +152,13 @@
"none": "沒有需要清理的範例圖片資料夾",
"partial": "清理完成,有 {failures} 個資料夾略過",
"error": "清理範例圖片資料夾失敗:{message}"
},
"fetchMissingLicenses": {
"label": "Refresh license metadata",
"loading": "Refreshing license metadata for {typePlural}...",
"success": "Updated license metadata for {count} {typePlural}",
"none": "All {typePlural} already have license metadata",
"error": "Failed to refresh license metadata for {typePlural}: {message}"
}
},
"header": {
@@ -188,6 +195,10 @@
"title": "篩選模型",
"baseModel": "基礎模型",
"modelTags": "標籤(前 20",
"modelTypes": "Model Types",
"license": "授權",
"noCreditRequired": "無需署名",
"allowSellingGeneratedContent": "允許銷售",
"clearAll": "清除所有篩選"
},
"theme": {
@@ -217,12 +228,19 @@
"videoSettings": "影片設定",
"layoutSettings": "版面設定",
"folderSettings": "資料夾設定",
"priorityTags": "優先標籤",
"downloadPathTemplates": "下載路徑範本",
"exampleImages": "範例圖片",
"updateFlags": "更新標記",
"autoOrganize": "Auto-organize",
"misc": "其他",
"metadataArchive": "中繼資料封存資料庫",
"proxySettings": "代理設定",
"priorityTags": "優先標籤"
"storageLocation": "設定位置",
"proxySettings": "代理設定"
},
"storage": {
"locationLabel": "可攜式模式",
"locationHelp": "啟用可將 settings.json 保存在儲存庫中;停用則保存在使用者設定目錄。"
},
"contentFiltering": {
"blurNsfwContent": "模糊 NSFW 內容",
@@ -234,6 +252,15 @@
"autoplayOnHover": "滑鼠懸停自動播放影片",
"autoplayOnHoverHelp": "僅在滑鼠懸停時播放影片預覽"
},
"autoOrganizeExclusions": {
"label": "自動整理排除項目",
"placeholder": "範例: curated/*, */backups/*; *_temp.safetensors",
"help": "跳過符合這些萬用字元模式的檔案。多個模式請用逗號或分號分隔。",
"validation": {
"noPatterns": "請輸入至少一個以逗號或分號分隔的模式。",
"saveFailed": "無法儲存排除項目:{message}"
}
},
"layoutSettings": {
"displayDensity": "顯示密度",
"displayDensityOptions": {
@@ -282,6 +309,26 @@
"defaultEmbeddingRootHelp": "設定下載、匯入和移動時的預設 Embedding 根目錄",
"noDefault": "未設定預設"
},
"priorityTags": {
"title": "優先標籤",
"description": "為每種模型類型自訂標籤的優先順序 (例如: character, concept, style(toon|toon_style))",
"placeholder": "character, concept, style(toon|toon_style)",
"helpLinkLabel": "開啟優先標籤說明",
"modelTypes": {
"lora": "LoRA",
"checkpoint": "Checkpoint",
"embedding": "Embedding"
},
"saveSuccess": "優先標籤已更新。",
"saveError": "更新優先標籤失敗。",
"loadingSuggestions": "正在載入建議...",
"validation": {
"missingClosingParen": "項目 {index} 缺少右括號。",
"missingCanonical": "項目 {index} 必須包含正規標籤名稱。",
"duplicateCanonical": "正規標籤 \"{tag}\" 出現多於一次。",
"unknown": "優先標籤設定無效。"
}
},
"downloadPathTemplates": {
"title": "下載路徑範本",
"help": "設定從 Civitai 下載時不同模型類型的資料夾結構。",
@@ -329,6 +376,14 @@
"download": "下載",
"restartRequired": "需要重新啟動"
},
"updateFlagStrategy": {
"label": "更新標記策略",
"help": "決定更新徽章是否僅在新版本與本地檔案共享相同基礎模型時顯示,或只要該模型有任何更新版本就顯示。",
"options": {
"sameBase": "依基礎模型匹配更新",
"any": "顯示任何可用更新"
}
},
"misc": {
"includeTriggerWords": "在 LoRA 語法中包含觸發詞",
"includeTriggerWordsHelp": "複製 LoRA 語法到剪貼簿時包含訓練觸發詞"
@@ -374,26 +429,6 @@
"proxyPassword": "密碼(選填)",
"proxyPasswordPlaceholder": "password",
"proxyPasswordHelp": "代理驗證所需的密碼(如有需要)"
},
"priorityTags": {
"title": "優先標籤",
"description": "為每種模型類型自訂標籤的優先順序 (例如: character, concept, style(toon|toon_style))",
"placeholder": "character, concept, style(toon|toon_style)",
"helpLinkLabel": "開啟優先標籤說明",
"modelTypes": {
"lora": "LoRA",
"checkpoint": "Checkpoint",
"embedding": "Embedding"
},
"saveSuccess": "優先標籤已更新。",
"saveError": "更新優先標籤失敗。",
"loadingSuggestions": "正在載入建議...",
"validation": {
"missingClosingParen": "項目 {index} 缺少右括號。",
"missingCanonical": "項目 {index} 必須包含正規標籤名稱。",
"duplicateCanonical": "正規標籤 \"{tag}\" 出現多於一次。",
"unknown": "優先標籤設定無效。"
}
}
},
"loras": {
@@ -471,6 +506,7 @@
},
"contextMenu": {
"refreshMetadata": "刷新 Civitai 資料",
"checkUpdates": "檢查更新",
"relinkCivitai": "重新連結 Civitai",
"copySyntax": "複製 LoRA 語法",
"copyFilename": "複製模型檔名",
@@ -492,6 +528,9 @@
},
"recipes": {
"title": "LoRA 配方",
"actions": {
"sendCheckpoint": "傳送到 ComfyUI"
},
"controls": {
"import": {
"action": "匯入",
@@ -875,6 +914,16 @@
"recipes": "配方",
"versions": "版本"
},
"license": {
"noImageSell": "No selling generated content",
"noRentCivit": "No Civitai generation",
"noRent": "No generation services",
"noSell": "No selling models",
"creditRequired": "需要創作者標示",
"noDerivatives": "禁止分享合併作品",
"noReLicense": "需要相同授權",
"restrictionsLabel": "授權限制"
},
"loading": {
"exampleImages": "載入範例圖片中...",
"description": "載入模型描述中...",
@@ -908,6 +957,18 @@
"viewLocalVersions": "檢視所有本地版本",
"viewLocalTooltip": "敬請期待"
},
"filters": {
"label": "基礎篩選",
"state": {
"showAll": "所有版本",
"showSameBase": "相同基礎模型"
},
"tooltip": {
"showAllVersions": "切換為顯示所有版本",
"showSameBaseVersions": "僅顯示與目前基礎模型相符的版本"
},
"empty": "沒有符合目前基礎模型篩選的版本。"
},
"empty": "此模型尚無版本歷史。",
"error": "載入版本失敗。",
"missingModelId": "此模型缺少 Civitai 模型 ID。",
@@ -1196,6 +1257,9 @@
"cannotSend": "無法傳送配方:缺少配方 ID",
"sendFailed": "傳送配方到工作流失敗",
"sendError": "傳送配方到工作流錯誤",
"missingCheckpointPath": "缺少檢查點路徑",
"missingCheckpointInfo": "缺少檢查點資訊",
"downloadCheckpointFailed": "下載檢查點失敗:{message}",
"cannotDelete": "無法刪除配方:缺少配方 ID",
"deleteConfirmationError": "顯示刪除確認時發生錯誤",
"deletedSuccessfully": "配方已成功刪除",
@@ -1302,7 +1366,7 @@
},
"triggerWords": {
"loadFailed": "無法載入訓練詞",
"tooLong": "觸發詞不可超過 30 個字",
"tooLong": "觸發詞不可超過 100 個字",
"tooMany": "最多允許 30 個觸發詞",
"alreadyExists": "此觸發詞已存在",
"updateSuccess": "觸發詞已更新",

package-lock.json generated
View File

@@ -114,6 +114,7 @@
}
],
"license": "MIT",
"peer": true,
"engines": {
"node": ">=18"
},
@@ -137,6 +138,7 @@
}
],
"license": "MIT",
"peer": true,
"engines": {
"node": ">=18"
}
@@ -1611,6 +1613,7 @@
"integrity": "sha512-MyL55p3Ut3cXbeBEG7Hcv0mVM8pp8PBNWxRqchZnSfAiES1v1mRnMeFfaHWIPULpwsYfvO+ZmMZz5tGCnjzDUQ==",
"dev": true,
"license": "MIT",
"peer": true,
"dependencies": {
"cssstyle": "^4.0.1",
"data-urls": "^5.0.0",

View File

@@ -196,9 +196,11 @@ class MetadataRegistry:
node_metadata[category] = {}
node_metadata[category][node_id] = current_metadata[category][node_id]
# Save to cache if we have any metadata for this node
# Save new metadata or clear stale cache entries when metadata is empty
if any(node_metadata.values()):
self.node_cache[cache_key] = node_metadata
else:
self.node_cache.pop(cache_key, None)
def clear_unused_cache(self):
"""Clean up node_cache entries that are no longer in use"""

View File

@@ -72,6 +72,18 @@ class GGUFLoaderExtractor(NodeMetadataExtractor):
model_name = inputs.get("gguf_name")
_store_checkpoint_metadata(metadata, node_id, model_name)
class KJNodesModelLoaderExtractor(NodeMetadataExtractor):
"""Extract metadata from KJNodes loaders that expose `model_name`."""
@staticmethod
def extract(node_id, inputs, outputs, metadata):
if not inputs or "model_name" not in inputs:
return
model_name = inputs.get("model_name")
_store_checkpoint_metadata(metadata, node_id, model_name)
class TSCCheckpointLoaderExtractor(NodeMetadataExtractor):
@staticmethod
def extract(node_id, inputs, outputs, metadata):
@@ -682,6 +694,7 @@ NODE_EXTRACTORS = {
"KSamplerAdvancedBasicPipe": KSamplerAdvancedBasicPipeExtractor, # comfyui-impact-pack
"KSampler_inspire_pipe": KSamplerBasicPipeExtractor, # comfyui-inspire-pack
"KSamplerAdvanced_inspire_pipe": KSamplerAdvancedBasicPipeExtractor, # comfyui-inspire-pack
"KSampler_inspire": SamplerExtractor, # comfyui-inspire-pack
# Sampling Selectors
"KSamplerSelect": KSamplerSelectExtractor, # Add KSamplerSelect
"BasicScheduler": BasicSchedulerExtractor, # Add BasicScheduler
@@ -695,6 +708,9 @@ NODE_EXTRACTORS = {
"NunchakuQwenImageDiTLoader": NunchakuQwenImageDiTLoaderExtractor, # ComfyUI-Nunchaku
"LoaderGGUF": GGUFLoaderExtractor, # calcuis gguf
"LoaderGGUFAdvanced": GGUFLoaderExtractor, # calcuis gguf
"GGUFLoaderKJ": KJNodesModelLoaderExtractor, # KJNodes
"DiffusionModelLoaderKJ": KJNodesModelLoaderExtractor, # KJNodes
"CheckpointLoaderKJ": CheckpointLoaderExtractor, # KJNodes
"UNETLoader": UNETLoaderExtractor, # Updated to use dedicated extractor
"UnetLoaderGGUF": UNETLoaderExtractor, # Updated to use dedicated extractor
"LoraLoader": LoraLoaderExtractor,

View File

@@ -273,9 +273,15 @@ class SaveImage:
length = int(parts[1])
prompt = prompt[:length]
filename = filename.replace(segment, prompt.strip())
elif key == "model" and 'checkpoint' in metadata_dict:
model = metadata_dict.get('checkpoint', '')
model = os.path.splitext(os.path.basename(model))[0]
elif key == "model":
model_value = metadata_dict.get('checkpoint')
if isinstance(model_value, (bytes, os.PathLike)):
model_value = str(model_value)
if not isinstance(model_value, str) or not model_value:
model = "model_unavailable"
else:
model = os.path.splitext(os.path.basename(model_value))[0]
if len(parts) >= 2:
length = int(parts[1])
model = model[:length]
@@ -442,4 +448,4 @@ class SaveImage:
add_counter_to_filename
)
return (images,)
return (images,)
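The hardened `%model%` handling above can be exercised in isolation. This is a sketch mirroring the hunk's fallbacks (bytes/PathLike coercion, placeholder when no checkpoint is recorded); the function name is illustrative:

```python
import os

def resolve_model_token(metadata_dict, length=None):
    # Tolerate bytes/PathLike checkpoint values; fall back to a
    # placeholder when no usable checkpoint path is present.
    model_value = metadata_dict.get('checkpoint')
    if isinstance(model_value, (bytes, os.PathLike)):
        model_value = str(model_value)
    if not isinstance(model_value, str) or not model_value:
        model = "model_unavailable"
    else:
        model = os.path.splitext(os.path.basename(model_value))[0]
    return model[:length] if length else model

resolve_model_token({'checkpoint': '/models/sd_xl_base_1.0.safetensors'})  # -> 'sd_xl_base_1.0'
resolve_model_token({})  # -> 'model_unavailable'
```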

View File

@@ -23,6 +23,10 @@ class TriggerWordToggle:
"default": True,
"tooltip": "Sets the default initial state (active or inactive) when trigger words are added."
}),
"allow_strength_adjustment": ("BOOLEAN", {
"default": False,
"tooltip": "Enable mouse wheel adjustment of each trigger word's strength."
}),
},
"optional": FlexibleOptionalInputType(any_type),
"hidden": {
@@ -47,7 +51,14 @@ class TriggerWordToggle:
else:
return data
def process_trigger_words(self, id, group_mode, default_active, **kwargs):
def process_trigger_words(
self,
id,
group_mode,
default_active,
allow_strength_adjustment=False,
**kwargs,
):
# Handle both old and new formats for trigger_words
trigger_words_data = self._get_toggle_data(kwargs, 'orinalMessage')
trigger_words = trigger_words_data if isinstance(trigger_words_data, str) else ""
@@ -63,27 +74,89 @@ class TriggerWordToggle:
trigger_data = json.loads(trigger_data)
# Create dictionaries to track active state of words or groups
active_state = {item['text']: item.get('active', False) for item in trigger_data}
# Also track strength values for each trigger word
active_state = {}
strength_map = {}
if group_mode:
# Split by two or more consecutive commas to get groups
groups = re.split(r',{2,}', trigger_words)
# Remove leading/trailing whitespace from each group
groups = [group.strip() for group in groups]
# Filter groups: keep those not in toggle_trigger_words or those that are active
filtered_groups = [group for group in groups if group not in active_state or active_state[group]]
if filtered_groups:
filtered_triggers = ', '.join(filtered_groups)
for item in trigger_data:
text = item['text']
active = item.get('active', False)
# Extract strength if it's in the format "(word:strength)"
strength_match = re.match(r'\((.+):([\d.]+)\)', text)
if strength_match:
original_word = strength_match.group(1).strip()
strength = float(strength_match.group(2))
active_state[original_word] = active
if allow_strength_adjustment:
strength_map[original_word] = strength
else:
filtered_triggers = ""
active_state[text.strip()] = active
if group_mode:
if isinstance(trigger_data, list):
filtered_groups = []
for item in trigger_data:
text = (item.get('text') or "").strip()
if not text:
continue
if item.get('active', False):
filtered_groups.append(text)
if filtered_groups:
filtered_triggers = ', '.join(filtered_groups)
else:
filtered_triggers = ""
else:
# Split by two or more consecutive commas to get groups
groups = re.split(r',{2,}', trigger_words)
# Remove leading/trailing whitespace from each group
groups = [group.strip() for group in groups]
# Process groups: keep those not in toggle_trigger_words or those that are active
filtered_groups = []
for group in groups:
# Check if this group contains any words that are in the active_state
group_words = [word.strip() for word in group.split(',')]
active_group_words = []
for word in group_words:
word_comparison = re.sub(r'\((.+):([\d.]+)\)', r'\1', word).strip()
if word_comparison not in active_state or active_state[word_comparison]:
active_group_words.append(
self._format_word_output(
word_comparison,
strength_map,
allow_strength_adjustment,
)
)
if active_group_words:
filtered_groups.append(', '.join(active_group_words))
if filtered_groups:
filtered_triggers = ', '.join(filtered_groups)
else:
filtered_triggers = ""
else:
# Original behavior for individual words mode
# Normal mode: split by commas and treat each word as a separate tag
original_words = [word.strip() for word in trigger_words.split(',')]
# Filter out empty strings
original_words = [word for word in original_words if word]
filtered_words = [word for word in original_words if word not in active_state or active_state[word]]
filtered_words = []
for word in original_words:
# Remove any existing strength formatting for comparison
word_comparison = re.sub(r'\((.+):([\d.]+)\)', r'\1', word).strip()
if word_comparison not in active_state or active_state[word_comparison]:
filtered_words.append(
self._format_word_output(
word_comparison,
strength_map,
allow_strength_adjustment,
)
)
if filtered_words:
filtered_triggers = ', '.join(filtered_words)
@@ -93,4 +166,9 @@ class TriggerWordToggle:
except Exception as e:
logger.error(f"Error processing trigger words: {e}")
return (filtered_triggers,)
return (filtered_triggers,)
def _format_word_output(self, base_word, strength_map, allow_strength_adjustment):
if allow_strength_adjustment and base_word in strength_map:
return f"({base_word}:{strength_map[base_word]:.2f})"
return base_word
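The `(word:strength)` round trip used throughout this hunk can be sketched on its own. The regex is the one from the diff; the helper names and the sample word are illustrative:

```python
import re

STRENGTH_RE = re.compile(r'\((.+):([\d.]+)\)')

def split_strength(text):
    # "(neon glow:1.25)" -> ("neon glow", 1.25); bare words get strength None
    m = STRENGTH_RE.match(text)
    if m:
        return m.group(1).strip(), float(m.group(2))
    return text.strip(), None

def format_word(base, strength_map, allow_strength):
    # Re-attach strength only when the toggle allows it and one was recorded
    if allow_strength and base in strength_map:
        return f"({base}:{strength_map[base]:.2f})"
    return base

word, strength = split_strength("(neon glow:1.25)")
format_word(word, {word: strength}, True)   # -> "(neon glow:1.25)"
format_word(word, {word: strength}, False)  # -> "neon glow"
```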

View File

@@ -8,6 +8,7 @@ from typing import Dict, List, Any, Optional, Tuple
from abc import ABC, abstractmethod
from ..config import config
from ..utils.constants import VALID_LORA_TYPES
from ..utils.civitai_utils import rewrite_preview_url
logger = logging.getLogger(__name__)
@@ -78,7 +79,7 @@ class RecipeMetadataParser(ABC):
# Update model name if available
if 'model' in civitai_info and 'name' in civitai_info['model']:
lora_entry['name'] = civitai_info['model']['name']
lora_entry['id'] = civitai_info.get('id')
lora_entry['modelId'] = civitai_info.get('modelId')
@@ -88,7 +89,10 @@ class RecipeMetadataParser(ABC):
# Get thumbnail URL from first image
if 'images' in civitai_info and civitai_info['images']:
lora_entry['thumbnailUrl'] = civitai_info['images'][0].get('url', '')
image_url = civitai_info['images'][0].get('url')
if image_url:
rewritten_image_url, _ = rewrite_preview_url(image_url, media_type='image')
lora_entry['thumbnailUrl'] = rewritten_image_url or image_url
# Get base model
current_base_model = civitai_info.get('baseModel', '')
@@ -151,33 +155,59 @@ class RecipeMetadataParser(ABC):
Args:
checkpoint: The checkpoint entry to populate
civitai_info: The response from Civitai API
civitai_info: The response from Civitai API or a (data, error_msg) tuple
Returns:
The populated checkpoint dict
"""
try:
if civitai_info and civitai_info.get("error") != "Model not found":
# Update model name if available
if 'model' in civitai_info and 'name' in civitai_info['model']:
checkpoint['name'] = civitai_info['model']['name']
# Update version if available
if 'name' in civitai_info:
checkpoint['version'] = civitai_info.get('name', '')
# Get thumbnail URL from first image
if 'images' in civitai_info and civitai_info['images']:
checkpoint['thumbnailUrl'] = civitai_info['images'][0].get('url', '')
# Get base model
checkpoint['baseModel'] = civitai_info.get('baseModel', '')
# Get download URL
checkpoint['downloadUrl'] = civitai_info.get('downloadUrl', '')
else:
# Model not found or deleted
civitai_data, error_msg = (
(civitai_info, None)
if not isinstance(civitai_info, tuple)
else civitai_info
)
if not civitai_data or error_msg == "Model not found":
checkpoint['isDeleted'] = True
return checkpoint
if 'model' in civitai_data and 'name' in civitai_data['model']:
checkpoint['name'] = civitai_data['model']['name']
if 'name' in civitai_data:
checkpoint['version'] = civitai_data.get('name', '')
if 'images' in civitai_data and civitai_data['images']:
image_url = civitai_data['images'][0].get('url')
if image_url:
rewritten_image_url, _ = rewrite_preview_url(image_url, media_type='image')
checkpoint['thumbnailUrl'] = rewritten_image_url or image_url
checkpoint['baseModel'] = civitai_data.get('baseModel', '')
checkpoint['downloadUrl'] = civitai_data.get('downloadUrl', '')
checkpoint['modelId'] = civitai_data.get('modelId', checkpoint.get('modelId', 0))
if 'files' in civitai_data:
model_file = next(
(
file
for file in civitai_data.get('files', [])
if file.get('type') == 'Model'
),
None,
)
if model_file:
checkpoint['size'] = model_file.get('sizeKB', 0) * 1024
sha256 = model_file.get('hashes', {}).get('SHA256')
if sha256:
checkpoint['hash'] = sha256.lower()
file_name = model_file.get('name', '')
if file_name:
checkpoint['file_name'] = os.path.splitext(file_name)[0]
except Exception as e:
logger.error(f"Error populating checkpoint from Civitai info: {e}")

View File

@@ -1,6 +1,7 @@
"""Parser for Automatic1111 metadata format."""
import re
import os
import json
import logging
from typing import Dict, Any
@@ -22,6 +23,7 @@ class AutomaticMetadataParser(RecipeMetadataParser):
CIVITAI_METADATA_REGEX = r', Civitai metadata:\s*(\{.*?\})'
EXTRANETS_REGEX = r'<(lora|hypernet):([^:]+):(-?[0-9.]+)>'
MODEL_HASH_PATTERN = r'Model hash: ([a-zA-Z0-9]+)'
MODEL_NAME_PATTERN = r'Model: ([^,]+)'
VAE_HASH_PATTERN = r'VAE hash: ([a-zA-Z0-9]+)'
def is_metadata_matching(self, user_comment: str) -> bool:
@@ -115,6 +117,12 @@ class AutomaticMetadataParser(RecipeMetadataParser):
except json.JSONDecodeError:
logger.error("Error parsing hashes JSON")
# Pick up model hash from parsed hashes if available
if "hashes" in metadata and not metadata.get("model_hash"):
model_hash_from_hashes = metadata["hashes"].get("model")
if model_hash_from_hashes:
metadata["model_hash"] = model_hash_from_hashes
# Extract Lora hashes in alternative format
lora_hashes_match = re.search(self.LORA_HASHES_REGEX, params_section)
if not hashes_match and lora_hashes_match:
@@ -137,6 +145,17 @@ class AutomaticMetadataParser(RecipeMetadataParser):
params_section = params_section.replace(lora_hashes_match.group(0), '')
except Exception as e:
logger.error(f"Error parsing Lora hashes: {e}")
# Extract checkpoint model hash/name when provided outside Civitai resources
model_hash_match = re.search(self.MODEL_HASH_PATTERN, params_section)
if model_hash_match:
metadata["model_hash"] = model_hash_match.group(1).strip()
params_section = params_section.replace(model_hash_match.group(0), '')
model_name_match = re.search(self.MODEL_NAME_PATTERN, params_section)
if model_name_match:
metadata["model_name"] = model_name_match.group(1).strip()
params_section = params_section.replace(model_name_match.group(0), '')
# Extract basic parameters
param_pattern = r'([A-Za-z\s]+): ([^,]+)'
@@ -178,9 +197,10 @@ class AutomaticMetadataParser(RecipeMetadataParser):
metadata["gen_params"] = gen_params
# Extract LoRA information
# Extract LoRA and checkpoint information
loras = []
base_model_counts = {}
checkpoint = None
# First use Civitai resources if available (more reliable source)
if metadata.get("civitai_resources"):
@@ -202,6 +222,50 @@ class AutomaticMetadataParser(RecipeMetadataParser):
resource["modelVersionId"] = air_modelVersionId
# --- End added ---
if resource.get("type") == "checkpoint" and resource.get("modelVersionId"):
version_id = resource.get("modelVersionId")
version_id_str = str(version_id)
checkpoint_entry = {
'id': version_id,
'modelId': resource.get("modelId", 0),
'name': resource.get("modelName", "Unknown Checkpoint"),
'version': resource.get("modelVersionName", resource.get("versionName", "")),
'type': resource.get("type", "checkpoint"),
'existsLocally': False,
'localPath': None,
'file_name': resource.get("modelName", ""),
'hash': resource.get("hash", "") or "",
'thumbnailUrl': '/loras_static/images/no-preview.png',
'baseModel': '',
'size': 0,
'downloadUrl': '',
'isDeleted': False
}
if metadata_provider:
try:
civitai_info = await metadata_provider.get_model_version_info(version_id_str)
checkpoint_entry = await self.populate_checkpoint_from_civitai(
checkpoint_entry,
civitai_info
)
except Exception as e:
logger.error(
"Error fetching Civitai info for checkpoint version %s: %s",
version_id,
e,
)
# Prefer the first checkpoint found
if checkpoint_entry.get("baseModel"):
base_model_value = checkpoint_entry["baseModel"]
base_model_counts[base_model_value] = base_model_counts.get(base_model_value, 0) + 1
if checkpoint is None:
checkpoint = checkpoint_entry
continue
if resource.get("type") in ["lora", "lycoris", "hypernet"] and resource.get("modelVersionId"):
# Initialize lora entry
lora_entry = {
@@ -237,6 +301,52 @@ class AutomaticMetadataParser(RecipeMetadataParser):
loras.append(lora_entry)
# Fallback checkpoint parsing from generic "Model" and "Model hash" fields
if checkpoint is None:
model_hash = metadata.get("model_hash")
if not model_hash and metadata.get("hashes"):
model_hash = metadata["hashes"].get("model")
model_name = metadata.get("model_name")
file_name = ""
if model_name:
cleaned_name = re.split(r"[\\\\/]", model_name)[-1]
file_name = os.path.splitext(cleaned_name)[0]
if model_hash or model_name:
checkpoint_entry = {
'id': 0,
'modelId': 0,
'name': model_name or "Unknown Checkpoint",
'version': '',
'type': 'checkpoint',
'hash': model_hash or "",
'existsLocally': False,
'localPath': None,
'file_name': file_name,
'thumbnailUrl': '/loras_static/images/no-preview.png',
'baseModel': '',
'size': 0,
'downloadUrl': '',
'isDeleted': False
}
if metadata_provider and model_hash:
try:
civitai_info = await metadata_provider.get_model_by_hash(model_hash)
checkpoint_entry = await self.populate_checkpoint_from_civitai(
checkpoint_entry,
civitai_info
)
except Exception as e:
logger.error(f"Error fetching Civitai info for checkpoint hash {model_hash}: {e}")
if checkpoint_entry.get("baseModel"):
base_model_value = checkpoint_entry["baseModel"]
base_model_counts[base_model_value] = base_model_counts.get(base_model_value, 0) + 1
checkpoint = checkpoint_entry
# If no LoRAs from Civitai resources or to supplement, extract from metadata["hashes"]
if not loras or len(loras) == 0:
# Extract lora weights from extranet tags in prompt (for later use)
@@ -300,7 +410,9 @@ class AutomaticMetadataParser(RecipeMetadataParser):
# Try to get base model from resources or make educated guess
base_model = None
if base_model_counts:
if checkpoint and checkpoint.get("baseModel"):
base_model = checkpoint.get("baseModel")
elif base_model_counts:
# Use the most common base model from the loras
base_model = max(base_model_counts.items(), key=lambda x: x[1])[0]
@@ -317,6 +429,10 @@ class AutomaticMetadataParser(RecipeMetadataParser):
'gen_params': filtered_gen_params,
'from_automatic_metadata': True
}
if checkpoint:
result['checkpoint'] = checkpoint
result['model'] = checkpoint
return result
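The `MODEL_HASH_PATTERN`/`MODEL_NAME_PATTERN` extraction added above can be sketched standalone. The patterns are the ones from the diff; the parameter line and helper name are illustrative:

```python
import os
import re

MODEL_HASH_PATTERN = r'Model hash: ([a-zA-Z0-9]+)'
MODEL_NAME_PATTERN = r'Model: ([^,]+)'

def extract_model_fields(params_section):
    # Pull the A1111 "Model hash"/"Model" fields out of a parameter line,
    # then strip any directory prefix and extension for a file name.
    fields = {}
    hash_match = re.search(MODEL_HASH_PATTERN, params_section)
    if hash_match:
        fields['model_hash'] = hash_match.group(1).strip()
    name_match = re.search(MODEL_NAME_PATTERN, params_section)
    if name_match:
        raw = name_match.group(1).strip()
        fields['model_name'] = raw
        fields['file_name'] = os.path.splitext(re.split(r'[\\/]', raw)[-1])[0]
    return fields

extract_model_fields(
    "Steps: 30, Model hash: 31e35c80fc, Model: models\\dreamshaper_8.safetensors"
)
# model_hash: '31e35c80fc', file_name: 'dreamshaper_8'
```

Note that "Model hash:" does not match `MODEL_NAME_PATTERN` (no colon directly after "Model"), so the two patterns can safely run over the same section.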

View File

@@ -23,13 +23,48 @@ class CivitaiApiMetadataParser(RecipeMetadataParser):
"""
if not metadata or not isinstance(metadata, dict):
return False
# Check for key markers specific to Civitai image metadata
return any([
"resources" in metadata,
"civitaiResources" in metadata,
"additionalResources" in metadata
])
def has_markers(payload: Dict[str, Any]) -> bool:
# Check for common CivitAI image metadata fields
civitai_image_fields = (
"resources",
"civitaiResources",
"additionalResources",
"hashes",
"prompt",
"negativePrompt",
"steps",
"sampler",
"cfgScale",
"seed",
"width",
"height",
"Model",
"Model hash"
)
return any(key in payload for key in civitai_image_fields)
# Check the main metadata object
if has_markers(metadata):
return True
# Check for LoRA hash patterns
hashes = metadata.get("hashes")
if isinstance(hashes, dict) and any(str(key).lower().startswith("lora:") for key in hashes):
return True
# Check nested meta object (common in CivitAI image responses)
nested_meta = metadata.get("meta")
if isinstance(nested_meta, dict):
if has_markers(nested_meta):
return True
# Also check for LoRA hash patterns in nested meta
hashes = nested_meta.get("hashes")
if isinstance(hashes, dict) and any(str(key).lower().startswith("lora:") for key in hashes):
return True
return False
async def parse_metadata(self, metadata, recipe_scanner=None, civitai_client=None) -> Dict[str, Any]:
"""Parse metadata from Civitai image format
@@ -45,11 +80,32 @@ class CivitaiApiMetadataParser(RecipeMetadataParser):
try:
# Get metadata provider instead of using civitai_client directly
metadata_provider = await get_default_metadata_provider()
# Civitai image responses may wrap the actual metadata inside a "meta" key
if (
isinstance(metadata, dict)
and "meta" in metadata
and isinstance(metadata["meta"], dict)
):
inner_meta = metadata["meta"]
if any(
key in inner_meta
for key in (
"resources",
"civitaiResources",
"additionalResources",
"hashes",
"prompt",
"negativePrompt",
)
):
metadata = inner_meta
# Initialize result structure
result = {
'base_model': None,
'loras': [],
'model': None,
'gen_params': {},
'from_civitai_image': True
}
@@ -61,8 +117,9 @@ class CivitaiApiMetadataParser(RecipeMetadataParser):
lora_hashes = {}
if "hashes" in metadata and isinstance(metadata["hashes"], dict):
for key, hash_value in metadata["hashes"].items():
if key.startswith("LORA:"):
lora_name = key.replace("LORA:", "")
key_str = str(key)
if key_str.lower().startswith("lora:"):
lora_name = key_str.split(":", 1)[1]
lora_hashes[lora_name] = hash_value
# Extract prompt and negative prompt
@@ -174,13 +231,48 @@ class CivitaiApiMetadataParser(RecipeMetadataParser):
# Process civitaiResources array
if "civitaiResources" in metadata and isinstance(metadata["civitaiResources"], list):
for resource in metadata["civitaiResources"]:
# Get unique identifier for deduplication
# Get resource type and identifier
resource_type = str(resource.get("type") or "").lower()
version_id = str(resource.get("modelVersionId", ""))
if resource_type == "checkpoint":
checkpoint_entry = {
'id': resource.get("modelVersionId", 0),
'modelId': resource.get("modelId", 0),
'name': resource.get("modelName", "Unknown Checkpoint"),
'version': resource.get("modelVersionName", ""),
'type': resource.get("type", "checkpoint"),
'existsLocally': False,
'localPath': None,
'file_name': resource.get("modelName", ""),
'hash': resource.get("hash", "") or "",
'thumbnailUrl': '/loras_static/images/no-preview.png',
'baseModel': '',
'size': 0,
'downloadUrl': '',
'isDeleted': False
}
if version_id and metadata_provider:
try:
civitai_info = await metadata_provider.get_model_version_info(version_id)
checkpoint_entry = await self.populate_checkpoint_from_civitai(
checkpoint_entry,
civitai_info
)
except Exception as e:
logger.error(f"Error fetching Civitai info for checkpoint version {version_id}: {e}")
if result["model"] is None:
result["model"] = checkpoint_entry
continue
# Skip if we've already added this LoRA
if version_id and version_id in added_loras:
continue
# Initialize lora entry
lora_entry = {
'id': resource.get("modelVersionId", 0),
@@ -196,31 +288,31 @@ class CivitaiApiMetadataParser(RecipeMetadataParser):
'downloadUrl': '',
'isDeleted': False
}
# Try to get info from Civitai if modelVersionId is available
if version_id and metadata_provider:
try:
# Use get_model_version_info instead of get_model_version
civitai_info = await metadata_provider.get_model_version_info(version_id)
populated_entry = await self.populate_lora_from_civitai(
lora_entry,
civitai_info,
recipe_scanner,
base_model_counts
)
if populated_entry is None:
continue # Skip invalid LoRA types
lora_entry = populated_entry
except Exception as e:
logger.error(f"Error fetching Civitai info for model version {version_id}: {e}")
# Track this LoRA in our deduplication dict
if version_id:
added_loras[version_id] = len(result["loras"])
result["loras"].append(lora_entry)
# Process additionalResources array
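The case-insensitive hash-key handling introduced in this file (`key_str.lower().startswith("lora:")`) can be sketched on its own; the sample hashes are invented:

```python
def collect_lora_hashes(meta):
    # Accept both "LORA:" and "lora:" prefixes, splitting only on the
    # first colon so names containing ":" survive intact.
    lora_hashes = {}
    for key, value in (meta.get("hashes") or {}).items():
        key_str = str(key)
        if key_str.lower().startswith("lora:"):
            lora_hashes[key_str.split(":", 1)[1]] = value
    return lora_hashes

collect_lora_hashes({"hashes": {"LORA:detail_tweaker": "aa11",
                                "lora:film_grain": "bb22",
                                "model": "cc33"}})
# -> {'detail_tweaker': 'aa11', 'film_grain': 'bb22'}
```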

View File

@@ -1,5 +1,6 @@
"""Parser for meta format (Lora_N Model hash) metadata."""
import os
import re
import logging
from typing import Dict, Any
@@ -145,14 +146,53 @@ class MetaFormatParser(RecipeMetadataParser):
loras.append(lora_entry)
# Extract model information
model = None
if 'model' in metadata:
model = metadata['model']
# Extract checkpoint information from generic Model/Model hash fields
checkpoint = None
model_hash = metadata.get("model_hash")
model_name = metadata.get("model")
if model_hash or model_name:
cleaned_name = None
if model_name:
cleaned_name = re.split(r"[\\\\/]", model_name)[-1]
cleaned_name = os.path.splitext(cleaned_name)[0]
checkpoint_entry = {
'id': 0,
'modelId': 0,
'name': model_name or "Unknown Checkpoint",
'version': '',
'type': 'checkpoint',
'hash': model_hash or "",
'existsLocally': False,
'localPath': None,
'file_name': cleaned_name or (model_name or ""),
'thumbnailUrl': '/loras_static/images/no-preview.png',
'baseModel': '',
'size': 0,
'downloadUrl': '',
'isDeleted': False
}
if metadata_provider and model_hash:
try:
civitai_info = await metadata_provider.get_model_by_hash(model_hash)
checkpoint_entry = await self.populate_checkpoint_from_civitai(
checkpoint_entry,
civitai_info
)
except Exception as e:
logger.error(f"Error fetching Civitai info for checkpoint hash {model_hash}: {e}")
if checkpoint_entry.get("baseModel"):
base_model_value = checkpoint_entry["baseModel"]
base_model_counts[base_model_value] = base_model_counts.get(base_model_value, 0) + 1
checkpoint = checkpoint_entry
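The checkpoint extraction above derives a clean file name by splitting on either path separator and dropping the extension. A minimal standalone sketch of that step (`clean_checkpoint_name` is a hypothetical helper name, not a function in this diff):

```python
import os
import re

def clean_checkpoint_name(model_name: str) -> str:
    """Strip any Windows/POSIX directory prefix and the file extension."""
    # Split on backslash or forward slash and keep the last path component
    base = re.split(r"[\\/]", model_name)[-1]
    # Drop the extension (e.g. .safetensors, .ckpt)
    return os.path.splitext(base)[0]
```

Both separators are handled in one character class so metadata written on Windows (`models\ckpt.safetensors`) and POSIX paths normalize identically.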
# Set base_model to the most common one from civitai_info or checkpoint
base_model = checkpoint["baseModel"] if checkpoint and checkpoint.get("baseModel") else None
if not base_model and base_model_counts:
    base_model = max(base_model_counts.items(), key=lambda x: x[1])[0]
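The selection order in the hunk above — prefer the checkpoint's own base model, else the most frequent entry in the counts dict — can be sketched in isolation; `most_common_base_model` is a hypothetical helper, not part of the diff:

```python
from typing import Dict, Optional

def most_common_base_model(base_model_counts: Dict[str, int],
                           checkpoint_base: Optional[str] = None) -> Optional[str]:
    """Prefer the checkpoint's base model; otherwise pick the most counted one."""
    if checkpoint_base:
        return checkpoint_base
    if not base_model_counts:
        return None
    # max over (name, count) pairs, keyed on the count
    return max(base_model_counts.items(), key=lambda x: x[1])[0]
```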
# Extract generation parameters for recipe metadata
@@ -170,7 +210,8 @@ class MetaFormatParser(RecipeMetadataParser):
'loras': loras,
'gen_params': gen_params,
'raw_metadata': metadata,
'from_meta_format': True,
**({'checkpoint': checkpoint, 'model': checkpoint} if checkpoint else {})
}
except Exception as e:

View File

@@ -3,7 +3,7 @@
import re
import json
import logging
from typing import Dict, Any, Optional
from ...config import config
from ..base import RecipeMetadataParser
from ..constants import GEN_PARAM_KEYS
@@ -16,6 +16,28 @@ class RecipeFormatParser(RecipeMetadataParser):
# Regular expression pattern for extracting recipe metadata
METADATA_MARKER = r'Recipe metadata: (\{.*\})'
async def _get_lora_from_version_index(self, recipe_scanner, model_version_id: Any) -> Optional[Dict[str, Any]]:
"""Return a cached LoRA entry by modelVersionId if available."""
if not recipe_scanner or not getattr(recipe_scanner, "_lora_scanner", None):
return None
try:
normalized_id = int(model_version_id)
except (TypeError, ValueError):
return None
try:
cache = await recipe_scanner._lora_scanner.get_cached_data()
except Exception as exc: # pragma: no cover - defensive logging
logger.debug("Unable to load lora cache for version lookup: %s", exc)
return None
if not cache or not getattr(cache, "version_index", None):
return None
return cache.version_index.get(normalized_id)
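`_get_lora_from_version_index` guards the cache lookup by first coercing the raw `modelVersionId` (which may arrive as a string, int, or garbage) to an `int` key. A minimal sketch of just that normalization, under the assumption the version index is keyed by `int`:

```python
from typing import Any, Optional

def normalize_version_id(model_version_id: Any) -> Optional[int]:
    """Coerce a raw modelVersionId to int; return None instead of raising."""
    try:
        return int(model_version_id)
    except (TypeError, ValueError):
        # None, dicts, or non-numeric strings all fall through to "no match"
        return None
```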
def is_metadata_matching(self, user_comment: str) -> bool:
"""Check if the user comment matches the metadata format"""
@@ -53,49 +75,110 @@ class RecipeFormatParser(RecipeMetadataParser):
'type': 'lora',
'weight': lora.get('strength', 1.0),
'file_name': lora.get('file_name', ''),
'hash': lora.get('hash', ''),
'existsLocally': False,
'inLibrary': False,
'localPath': None,
'thumbnailUrl': '/loras_static/images/no-preview.png',
'size': 0
}
# Check if this LoRA exists locally by SHA256 hash
if recipe_scanner:
    lora_scanner = recipe_scanner._lora_scanner
    if lora.get('hash'):
        exists_locally = lora_scanner.has_hash(lora['hash'])
        if exists_locally:
            lora_cache = await lora_scanner.get_cached_data()
            lora_item = next((item for item in lora_cache.raw_data if item['sha256'].lower() == lora['hash'].lower()), None)
            if lora_item:
                lora_entry['existsLocally'] = True
                lora_entry['inLibrary'] = True
                lora_entry['localPath'] = lora_item['file_path']
                lora_entry['file_name'] = lora_item['file_name']
                lora_entry['size'] = lora_item['size']
                lora_entry['thumbnailUrl'] = config.get_preview_static_url(lora_item['preview_url'])
    # If we still don't have a local match, try matching by modelVersionId
    if not lora_entry['existsLocally'] and lora.get('modelVersionId') is not None:
        cached_lora = await self._get_lora_from_version_index(recipe_scanner, lora.get('modelVersionId'))
        if cached_lora:
            lora_entry['existsLocally'] = True
            lora_entry['inLibrary'] = True
            lora_entry['localPath'] = cached_lora.get('file_path')
            lora_entry['file_name'] = cached_lora.get('file_name') or lora_entry['file_name']
            lora_entry['size'] = cached_lora.get('size', lora_entry['size'])
            if cached_lora.get('sha256'):
                lora_entry['hash'] = cached_lora['sha256']
            preview_url = cached_lora.get('preview_url')
            if preview_url:
                lora_entry['thumbnailUrl'] = config.get_preview_static_url(preview_url)
# Try to get additional info from Civitai if we have a model version ID and still missing locally
if not lora_entry['existsLocally'] and lora.get('modelVersionId') and metadata_provider:
    try:
        civitai_info_tuple = await metadata_provider.get_model_version_info(lora['modelVersionId'])
        # Populate lora entry with Civitai info
        populated_entry = await self.populate_lora_from_civitai(
            lora_entry,
            civitai_info_tuple,
            recipe_scanner,
            None,  # No need to track base model counts
            lora_entry.get('hash', '')
        )
        if populated_entry is None:
            continue  # Skip invalid LoRA types
        lora_entry = populated_entry
    except Exception as e:
        logger.error(f"Error fetching Civitai info for LoRA: {e}")
        lora_entry['thumbnailUrl'] = '/loras_static/images/no-preview.png'
loras.append(lora_entry)
logger.info(f"Found {len(loras)} loras in recipe metadata")
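The local-file check above scans the cached raw data for the first item whose SHA256 matches, comparing case-insensitively since hashes may be stored upper- or lowercase. A self-contained sketch of that lookup (`find_by_hash` and the sample items are illustrative, not from the diff):

```python
from typing import Any, Dict, List, Optional

def find_by_hash(raw_data: List[Dict[str, Any]], sha256: str) -> Optional[Dict[str, Any]]:
    """First cached item whose sha256 matches, compared case-insensitively."""
    target = sha256.lower()
    return next((item for item in raw_data if item["sha256"].lower() == target), None)

# Hypothetical cache contents for illustration
cache_items = [
    {"sha256": "ABCD1234", "file_name": "style_a.safetensors"},
    {"sha256": "ef015678", "file_name": "style_b.safetensors"},
]
```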
# Process checkpoint information if present
checkpoint = None
checkpoint_data = recipe_metadata.get('checkpoint') or {}
if isinstance(checkpoint_data, dict) and checkpoint_data:
version_id = checkpoint_data.get('modelVersionId') or checkpoint_data.get('id')
checkpoint_entry = {
'id': version_id or 0,
'modelId': checkpoint_data.get('modelId', 0),
'name': checkpoint_data.get('name', 'Unknown Checkpoint'),
'version': checkpoint_data.get('version', ''),
'type': checkpoint_data.get('type', 'checkpoint'),
'hash': checkpoint_data.get('hash', ''),
'existsLocally': False,
'localPath': None,
'file_name': checkpoint_data.get('file_name', ''),
'thumbnailUrl': '/loras_static/images/no-preview.png',
'baseModel': '',
'size': 0,
'downloadUrl': '',
'isDeleted': False
}
if metadata_provider:
try:
civitai_info = None
if version_id:
civitai_info = await metadata_provider.get_model_version_info(str(version_id))
elif checkpoint_entry.get('hash'):
civitai_info = await metadata_provider.get_model_by_hash(checkpoint_entry['hash'])
if civitai_info:
checkpoint_entry = await self.populate_checkpoint_from_civitai(checkpoint_entry, civitai_info)
except Exception as e:
logger.error(f"Error fetching Civitai info for checkpoint in recipe metadata: {e}")
checkpoint = checkpoint_entry
# Filter gen_params to only include recognized keys
filtered_gen_params = {}
@@ -105,12 +188,13 @@ class RecipeFormatParser(RecipeMetadataParser):
filtered_gen_params[key] = value
return {
'base_model': checkpoint['baseModel'] if checkpoint and checkpoint.get('baseModel') else recipe_metadata.get('base_model', ''),
'loras': loras,
'gen_params': filtered_gen_params,
'tags': recipe_metadata.get('tags', []),
'title': recipe_metadata.get('title', ''),
'from_recipe_metadata': True,
**({'checkpoint': checkpoint, 'model': checkpoint} if checkpoint else {})
}
except Exception as e:

View File

@@ -126,6 +126,7 @@ class BaseModelRoutes(ABC):
metadata_manager=MetadataManager,
metadata_loader=self._metadata_sync_service.load_local_metadata,
recipe_scanner_factory=ServiceRegistry.get_recipe_scanner,
update_service=self._model_update_service,
)
self._handler_set = None
self._handler_mapping = None
@@ -297,4 +298,3 @@ class BaseModelRoutes(ABC):
if self._model_update_service is None:
raise RuntimeError("Model update service has not been attached")
return self._model_update_service

View File

@@ -191,6 +191,8 @@ class BaseRecipeRoutes:
logger=logger,
persistence_service=persistence_service,
analysis_service=analysis_service,
downloader_factory=get_downloader,
civitai_client_getter=civitai_client_getter,
)
analysis = RecipeAnalysisHandler(
ensure_dependencies_ready=self.ensure_dependencies_ready,
@@ -214,4 +216,3 @@ class BaseRecipeRoutes:
analysis=analysis,
sharing=sharing,
)

View File

@@ -1,4 +1,5 @@
import logging
from typing import Dict
from aiohttp import web
from .base_model_routes import BaseModelRoutes
@@ -51,6 +52,19 @@ class CheckpointRoutes(BaseModelRoutes):
def _get_expected_model_types(self) -> str:
"""Get expected model types string for error messages"""
return "Checkpoint"
def _parse_specific_params(self, request: web.Request) -> Dict:
"""Parse Checkpoint-specific parameters"""
params: Dict = {}
if 'checkpoint_hash' in request.query:
params['hash_filters'] = {'single_hash': request.query['checkpoint_hash'].lower()}
elif 'checkpoint_hashes' in request.query:
params['hash_filters'] = {
'multiple_hashes': [h.lower() for h in request.query['checkpoint_hashes'].split(',')]
}
return params
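`_parse_specific_params` above gives the single-hash query parameter precedence over the comma-separated list, lowercasing both forms. A framework-free sketch taking a plain dict in place of `request.query` (an assumption for illustration):

```python
from typing import Dict

def parse_checkpoint_hash_filters(query: Dict[str, str]) -> Dict:
    """Single-hash param wins over the CSV list; both are normalized to lowercase."""
    params: Dict = {}
    if "checkpoint_hash" in query:
        params["hash_filters"] = {"single_hash": query["checkpoint_hash"].lower()}
    elif "checkpoint_hashes" in query:
        params["hash_filters"] = {
            "multiple_hashes": [h.lower() for h in query["checkpoint_hashes"].split(",")]
        }
    return params
```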
async def get_checkpoint_info(self, request: web.Request) -> web.Response:
"""Get detailed information for a specific checkpoint by name"""

View File

@@ -180,6 +180,7 @@ class SettingsHandler:
"download_path_templates",
"enable_metadata_archive_db",
"language",
"use_portable_settings",
"proxy_enabled",
"proxy_type",
"proxy_host",
@@ -200,6 +201,8 @@ class SettingsHandler:
"priority_tags",
"model_card_footer_action",
"model_name_display",
"update_flag_strategy",
"auto_organize_exclusions",
)
_PROXY_KEYS = {"proxy_enabled", "proxy_host", "proxy_port", "proxy_username", "proxy_password", "proxy_type"}

View File

@@ -6,7 +6,7 @@ import json
import logging
import os
from dataclasses import dataclass
from typing import Any, Awaitable, Callable, Dict, Iterable, List, Mapping, Optional
from aiohttp import web
import jinja2
@@ -16,7 +16,7 @@ from ...services.download_coordinator import DownloadCoordinator
from ...services.metadata_sync_service import MetadataSyncService
from ...services.model_file_service import ModelMoveService
from ...services.preview_asset_service import PreviewAssetService
from ...services.settings_manager import SettingsManager, get_settings_manager
from ...services.tag_update_service import TagUpdateService
from ...services.use_cases import (
AutoOrganizeInProgressError,
@@ -30,9 +30,17 @@ from ...services.use_cases import (
from ...services.websocket_manager import WebSocketManager
from ...services.websocket_progress_callback import WebSocketProgressCallback
from ...services.errors import RateLimitError, ResourceNotFoundError
from ...utils.civitai_utils import resolve_license_payload
from ...utils.file_utils import calculate_sha256
from ...utils.metadata_manager import MetadataManager
LICENSE_FIELDS = (
"allowNoCredit",
"allowCommercialUse",
"allowDerivatives",
"allowDifferentLicense",
)
class ModelPageView:
"""Render the HTML view for model listings."""
@@ -144,7 +152,30 @@ class ModelListingHandler:
fuzzy_search = request.query.get("fuzzy_search", "false").lower() == "true"
base_models = request.query.getall("base_model", [])
model_types = list(request.query.getall("model_type", []))
model_types.extend(request.query.getall("civitai_model_type", []))
# Support legacy ?tag=foo plus new ?tag_include=foo / ?tag_exclude=foo parameters
legacy_tags = request.query.getall("tag", [])
if not legacy_tags:
legacy_csv = request.query.get("tags")
if legacy_csv:
legacy_tags = [tag.strip() for tag in legacy_csv.split(",") if tag.strip()]
include_tags = request.query.getall("tag_include", [])
exclude_tags = request.query.getall("tag_exclude", [])
tag_filters: Dict[str, str] = {}
for tag in legacy_tags:
if tag:
tag_filters[tag] = "include"
for tag in include_tags:
if tag:
tag_filters[tag] = "include"
for tag in exclude_tags:
if tag:
tag_filters[tag] = "exclude"
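Because the three loops above write into one dict in sequence, later writes win: an explicit `tag_exclude` for a tag overrides any legacy or `tag_include` entry for the same tag. A minimal sketch of that merge (`build_tag_filters` is a hypothetical name):

```python
from typing import Dict, Iterable

def build_tag_filters(legacy: Iterable[str], include: Iterable[str],
                      exclude: Iterable[str]) -> Dict[str, str]:
    """Merge legacy/include/exclude tag lists; exclude overrides for the same tag."""
    tag_filters: Dict[str, str] = {}
    for tag in legacy:
        if tag:
            tag_filters[tag] = "include"
    for tag in include:
        if tag:
            tag_filters[tag] = "include"
    for tag in exclude:
        if tag:
            tag_filters[tag] = "exclude"  # later write wins
    return tag_filters
```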
favorites_only = request.query.get("favorites_only", "false").lower() == "true"
search_options = {
@@ -167,6 +198,19 @@ class ModelListingHandler:
pass
update_available_only = request.query.get("update_available_only", "false").lower() == "true"
# New license-based query filters
credit_required = request.query.get("credit_required")
if credit_required is not None:
credit_required = credit_required.lower() not in ("false", "0", "")
else:
credit_required = None # None means no filter applied
allow_selling_generated_content = request.query.get("allow_selling_generated_content")
if allow_selling_generated_content is not None:
allow_selling_generated_content = allow_selling_generated_content.lower() not in ("false", "0", "")
else:
allow_selling_generated_content = None # None means no filter applied
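Both license filters use the same tri-state convention: an absent query parameter means "no filter", and only the literal falsy strings map to `False`. A sketch of that parsing on its own (`parse_tristate_flag` is a hypothetical helper):

```python
from typing import Optional

def parse_tristate_flag(raw: Optional[str]) -> Optional[bool]:
    """None means the filter is absent; 'false', '0', '' are False; anything else True."""
    if raw is None:
        return None  # no filter applied
    return raw.lower() not in ("false", "0", "")
```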
return {
"page": page,
@@ -176,11 +220,14 @@ class ModelListingHandler:
"search": search,
"fuzzy_search": fuzzy_search,
"base_models": base_models,
"tags": tag_filters,
"search_options": search_options,
"hash_filters": hash_filters,
"favorites_only": favorites_only,
"update_available_only": update_available_only,
"credit_required": credit_required,
"allow_selling_generated_content": allow_selling_generated_content,
"model_types": model_types,
**self._parse_specific_params(request),
}
@@ -513,6 +560,17 @@ class ModelQueryHandler:
self._logger.error("Error retrieving base models: %s", exc)
return web.json_response({"success": False, "error": str(exc)}, status=500)
async def get_model_types(self, request: web.Request) -> web.Response:
try:
limit = int(request.query.get("limit", "20"))
if limit < 1 or limit > 100:
limit = 20
model_types = await self._service.get_model_types(limit)
return web.json_response({"success": True, "model_types": model_types})
except Exception as exc:
self._logger.error("Error retrieving model types: %s", exc)
return web.json_response({"success": False, "error": str(exc)}, status=500)
async def scan_models(self, request: web.Request) -> web.Response:
try:
full_rebuild = request.query.get("full_rebuild", "false").lower() == "true"
@@ -623,9 +681,16 @@ class ModelQueryHandler:
model_name = request.query.get("name")
if not model_name:
return web.Response(text=f"{self._service.model_type.capitalize()} file name is required", status=400)
include_license_flags = (request.query.get("license_flags", "").strip().lower() in {"1", "true", "yes", "on"})
preview_url = await self._service.get_model_preview_url(model_name)
if preview_url:
    response_payload: dict[str, object] = {"success": True, "preview_url": preview_url}
    if include_license_flags:
        model_data = await self._service.get_model_info_by_name(model_name)
        license_flags = (model_data or {}).get("license_flags")
        if license_flags is not None:
            response_payload["license_flags"] = int(license_flags)
    return web.json_response(response_payload)
return web.json_response({"success": False, "error": f"No preview URL found for the specified {self._service.model_type}"}, status=404)
except Exception as exc:
self._logger.error("Error getting %s preview URL: %s", self._service.model_type, exc, exc_info=True)
@@ -986,16 +1051,23 @@ class ModelAutoOrganizeHandler:
async def auto_organize_models(self, request: web.Request) -> web.Response:
try:
file_paths = None
exclusion_patterns = None
settings_manager = get_settings_manager()
if request.method == "POST":
try:
data = await request.json()
file_paths = data.get("file_paths")
if "exclusion_patterns" in data:
exclusion_patterns = settings_manager.normalize_auto_organize_exclusions(
data.get("exclusion_patterns")
)
except Exception: # pragma: no cover - permissive path
pass
result = await self._use_case.execute(
file_paths=file_paths,
progress_callback=self._progress_callback,
exclusion_patterns=exclusion_patterns,
)
return web.json_response(result.to_dict())
except AutoOrganizeInProgressError:
@@ -1040,6 +1112,77 @@ class ModelUpdateHandler:
self._metadata_provider_selector = metadata_provider_selector
self._logger = logger
async def fetch_missing_civitai_license_data(self, request: web.Request) -> web.Response:
payload = await self._read_json(request)
target_model_ids = self._extract_target_model_ids(payload)
provider = await self._get_civitai_provider()
if provider is None:
return web.json_response(
{"success": False, "error": "Civitai provider not available"},
status=503,
)
try:
cache = await self._service.scanner.get_cached_data()
except Exception as exc:
self._logger.error("Failed to load cache for license refresh: %s", exc, exc_info=True)
cache = None
target_set = set(target_model_ids) if target_model_ids is not None else None
candidates = await self._collect_models_missing_license(cache, target_set)
if not candidates:
return web.json_response({"success": True, "updated": []})
model_ids = sorted(candidates.keys())
try:
license_map = await self._fetch_license_info(provider, model_ids)
except RateLimitError as exc:
return web.json_response(
{"success": False, "error": str(exc) or "Rate limited"},
status=429,
)
except Exception as exc: # pragma: no cover - defensive log
self._logger.error("Failed to fetch license info: %s", exc, exc_info=True)
return web.json_response({"success": False, "error": str(exc)}, status=500)
updated: List[Dict[str, str]] = []
errors: List[Dict[str, str]] = []
for model_id in model_ids:
license_payload = license_map.get(model_id)
if not license_payload:
continue
resolved_payload = resolve_license_payload(license_payload)
for context in candidates.get(model_id, []):
metadata_path = context["file_path"]
metadata_payload = context["metadata"]
civitai_section = metadata_payload.setdefault("civitai", {})
model_section = civitai_section.get("model")
if not isinstance(model_section, Mapping):
model_section = {}
model_section.update(resolved_payload)
civitai_section["model"] = model_section
metadata_payload["civitai"] = civitai_section
try:
await MetadataManager.save_metadata(metadata_path, metadata_payload)
updated.append({"modelId": model_id, "filePath": metadata_path})
except Exception as exc:
self._logger.error(
"Failed to save metadata for %s: %s",
metadata_path,
exc,
exc_info=True,
)
errors.append({"filePath": metadata_path, "error": str(exc)})
response_payload = {"success": True, "updated": updated}
missing_model_ids = [mid for mid in model_ids if mid not in license_map]
if missing_model_ids:
response_payload["missingModelIds"] = missing_model_ids
if errors:
response_payload["errors"] = errors
return web.json_response(response_payload)
async def refresh_model_updates(self, request: web.Request) -> web.Response:
payload = await self._read_json(request)
force_refresh = self._parse_bool(request.query.get("force")) or self._parse_bool(
@@ -1204,6 +1347,132 @@ class ModelUpdateHandler:
self._logger.error("Failed to acquire civitai provider: %s", exc, exc_info=True)
return None
async def _collect_models_missing_license(
self,
cache,
target_model_ids: Optional[set[int]],
) -> Dict[int, List[Dict[str, Any]]]:
entries: Dict[int, List[Dict[str, Any]]] = {}
if cache is None:
return entries
raw_data = getattr(cache, "raw_data", None) or []
seen_paths: set[str] = set()
target_set = target_model_ids
for item in raw_data:
if not isinstance(item, Mapping):
continue
file_path = item.get("file_path")
if not isinstance(file_path, str) or not file_path or file_path in seen_paths:
continue
seen_paths.add(file_path)
civitai_entry = item.get("civitai")
if not isinstance(civitai_entry, Mapping):
continue
model_id = self._normalize_model_id(civitai_entry.get("modelId"))
if model_id is None:
continue
if target_set is not None and model_id not in target_set:
continue
try:
metadata_obj, should_skip = await MetadataManager.load_metadata(file_path)
except Exception as exc:
self._logger.debug("Failed to load metadata for %s: %s", file_path, exc)
continue
if metadata_obj is None or should_skip:
continue
metadata_payload = self._convert_metadata_to_dict(metadata_obj)
civitai_payload = metadata_payload.get("civitai")
if not isinstance(civitai_payload, Mapping):
civitai_payload = {}
model_payload = civitai_payload.get("model")
if not isinstance(model_payload, Mapping):
model_payload = {}
missing = [key for key in LICENSE_FIELDS if key not in model_payload]
if not missing:
continue
civitai_payload["model"] = model_payload
metadata_payload["civitai"] = civitai_payload
entries.setdefault(model_id, []).append(
{"file_path": file_path, "metadata": metadata_payload}
)
return entries
async def _fetch_license_info(
self,
provider,
model_ids: List[int],
) -> Dict[int, Dict[str, Any]]:
if not model_ids:
return {}
BATCH_SIZE = 100
aggregated: Dict[int, Dict[str, Any]] = {}
for start in range(0, len(model_ids), BATCH_SIZE):
chunk = model_ids[start : start + BATCH_SIZE]
response = await provider.get_model_versions_bulk(chunk)
if not isinstance(response, Mapping):
continue
for raw_id, payload in response.items():
normalized_id = self._normalize_model_id(raw_id)
if normalized_id is None or not isinstance(payload, Mapping):
continue
license_data: Dict[str, Any] = {}
for field in LICENSE_FIELDS:
license_data[field] = payload.get(field)
aggregated[normalized_id] = license_data
return aggregated
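`_fetch_license_info` slices the sorted id list into fixed-size chunks so no single bulk request exceeds 100 ids. The slicing arithmetic alone, as a sketch (`chunked` is a hypothetical name):

```python
from typing import List

BATCH_SIZE = 100

def chunked(model_ids: List[int], batch_size: int = BATCH_SIZE) -> List[List[int]]:
    """Slice ids into request-sized batches, mirroring the loop in _fetch_license_info."""
    return [model_ids[start:start + batch_size]
            for start in range(0, len(model_ids), batch_size)]
```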
def _extract_target_model_ids(self, payload: Dict) -> Optional[List[int]]:
if not isinstance(payload, Mapping):
return None
raw_ids = payload.get("modelIds")
if raw_ids is None:
raw_ids = payload.get("model_ids")
if not isinstance(raw_ids, (list, tuple, set)):
return None
normalized: List[int] = []
for candidate in raw_ids:
model_id = self._normalize_model_id(candidate)
if model_id is not None:
normalized.append(model_id)
if not normalized:
return None
return sorted(set(normalized))
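`_extract_target_model_ids` accepts both `modelIds` and `model_ids`, drops non-numeric entries, deduplicates, and sorts; `None` signals "refresh everything". A simplified self-contained sketch (it inlines the id normalization, so it is an approximation of the handler, not a copy):

```python
from typing import Any, List, Optional

def extract_target_model_ids(payload: Any) -> Optional[List[int]]:
    """Accept camelCase or snake_case keys; coerce, dedupe, and sort the ids."""
    if not isinstance(payload, dict):
        return None
    raw_ids = payload.get("modelIds")
    if raw_ids is None:
        raw_ids = payload.get("model_ids")
    if not isinstance(raw_ids, (list, tuple, set)):
        return None
    normalized: List[int] = []
    for candidate in raw_ids:
        try:
            normalized.append(int(candidate))
        except (TypeError, ValueError):
            continue  # skip non-numeric entries
    # Empty after filtering means "no explicit targets"
    return sorted(set(normalized)) or None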
@staticmethod
def _convert_metadata_to_dict(metadata: Any) -> Dict[str, Any]:
if metadata is None:
return {}
to_dict = getattr(metadata, "to_dict", None)
if callable(to_dict):
try:
return to_dict()
except Exception:
pass
if isinstance(metadata, Mapping):
return dict(metadata)
return {}
async def _read_json(self, request: web.Request) -> Dict:
if not request.can_read_body:
return {}
@@ -1331,6 +1600,7 @@ class ModelHandlerSet:
"verify_duplicates": self.management.verify_duplicates,
"get_top_tags": self.query.get_top_tags,
"get_base_models": self.query.get_base_models,
"get_model_types": self.query.get_model_types,
"scan_models": self.query.scan_models,
"get_model_roots": self.query.get_model_roots,
"get_folders": self.query.get_folders,
@@ -1358,6 +1628,7 @@ class ModelHandlerSet:
"get_model_description": self.query.get_model_description,
"get_relative_paths": self.query.get_relative_paths,
"refresh_model_updates": self.updates.refresh_model_updates,
"fetch_missing_civitai_license_data": self.updates.fetch_missing_civitai_license_data,
"set_model_update_ignore": self.updates.set_model_update_ignore,
"set_version_update_ignore": self.updates.set_version_update_ignore,
"get_model_update_status": self.updates.get_model_update_status,

View File

@@ -4,8 +4,10 @@ from __future__ import annotations
import json
import logging
import os
import re
import tempfile
from dataclasses import dataclass
from typing import Any, Awaitable, Callable, Dict, List, Mapping, Optional
from aiohttp import web
@@ -20,6 +22,7 @@ from ...services.recipes import (
RecipeSharingService,
RecipeValidationError,
)
from ...services.metadata_service import get_default_metadata_provider
Logger = logging.Logger
EnsureDependenciesCallable = Callable[[], Awaitable[None]]
@@ -45,6 +48,7 @@ class RecipeHandlerSet:
"render_page": self.page_view.render_page,
"list_recipes": self.listing.list_recipes,
"get_recipe": self.listing.get_recipe,
"import_remote_recipe": self.management.import_remote_recipe,
"analyze_uploaded_image": self.analysis.analyze_uploaded_image,
"analyze_local_image": self.analysis.analyze_local_image,
"save_recipe": self.management.save_recipe,
@@ -152,14 +156,31 @@ class RecipeListingHandler:
"lora_model": request.query.get("search_lora_model", "true").lower() == "true",
}
filters: Dict[str, Any] = {}
base_models = request.query.get("base_models")
if base_models:
filters["base_model"] = base_models.split(",")
tag_filters: Dict[str, str] = {}
legacy_tags = request.query.get("tags")
if legacy_tags:
for tag in legacy_tags.split(","):
tag = tag.strip()
if tag:
tag_filters[tag] = "include"
include_tags = request.query.getall("tag_include", [])
for tag in include_tags:
if tag:
tag_filters[tag] = "include"
exclude_tags = request.query.getall("tag_exclude", [])
for tag in exclude_tags:
if tag:
tag_filters[tag] = "exclude"
if tag_filters:
filters["tags"] = tag_filters
lora_hash = request.query.get("lora_hash")
@@ -387,12 +408,16 @@ class RecipeManagementHandler:
logger: Logger,
persistence_service: RecipePersistenceService,
analysis_service: RecipeAnalysisService,
downloader_factory,
civitai_client_getter: CivitaiClientGetter,
) -> None:
self._ensure_dependencies_ready = ensure_dependencies_ready
self._recipe_scanner_getter = recipe_scanner_getter
self._logger = logger
self._persistence_service = persistence_service
self._analysis_service = analysis_service
self._downloader_factory = downloader_factory
self._civitai_client_getter = civitai_client_getter
async def save_recipe(self, request: web.Request) -> web.Response:
try:
@@ -419,6 +444,64 @@ class RecipeManagementHandler:
self._logger.error("Error saving recipe: %s", exc, exc_info=True)
return web.json_response({"error": str(exc)}, status=500)
async def import_remote_recipe(self, request: web.Request) -> web.Response:
try:
await self._ensure_dependencies_ready()
recipe_scanner = self._recipe_scanner_getter()
if recipe_scanner is None:
raise RuntimeError("Recipe scanner unavailable")
params = request.rel_url.query
image_url = params.get("image_url")
name = params.get("name")
resources_raw = params.get("resources")
if not image_url:
raise RecipeValidationError("Missing required field: image_url")
if not name:
raise RecipeValidationError("Missing required field: name")
if not resources_raw:
raise RecipeValidationError("Missing required field: resources")
checkpoint_entry, lora_entries = self._parse_resources_payload(resources_raw)
gen_params = self._parse_gen_params(params.get("gen_params"))
metadata: Dict[str, Any] = {
"base_model": params.get("base_model", "") or "",
"loras": lora_entries,
}
source_path = params.get("source_path")
if source_path:
metadata["source_path"] = source_path
if gen_params is not None:
metadata["gen_params"] = gen_params
if checkpoint_entry:
metadata["checkpoint"] = checkpoint_entry
gen_params_ref = metadata.setdefault("gen_params", {})
if "checkpoint" not in gen_params_ref:
gen_params_ref["checkpoint"] = checkpoint_entry
base_model_from_metadata = await self._resolve_base_model_from_checkpoint(checkpoint_entry)
if base_model_from_metadata:
metadata["base_model"] = base_model_from_metadata
tags = self._parse_tags(params.get("tags"))
image_bytes = await self._download_image_bytes(image_url)
result = await self._persistence_service.save_recipe(
recipe_scanner=recipe_scanner,
image_bytes=image_bytes,
image_base64=None,
name=name,
tags=tags,
metadata=metadata,
)
return web.json_response(result.payload, status=result.status)
except RecipeValidationError as exc:
return web.json_response({"error": str(exc)}, status=400)
except RecipeDownloadError as exc:
return web.json_response({"error": str(exc)}, status=400)
except Exception as exc:
self._logger.error("Error importing recipe from remote source: %s", exc, exc_info=True)
return web.json_response({"error": str(exc)}, status=500)
async def delete_recipe(self, request: web.Request) -> web.Response:
try:
await self._ensure_dependencies_ready()
@@ -578,6 +661,140 @@ class RecipeManagementHandler:
"metadata": metadata,
}
def _parse_tags(self, tag_text: Optional[str]) -> list[str]:
if not tag_text:
return []
return [tag.strip() for tag in tag_text.split(",") if tag.strip()]
def _parse_gen_params(self, payload: Optional[str]) -> Optional[Dict[str, Any]]:
if payload is None:
return None
if payload == "":
return {}
try:
parsed = json.loads(payload)
except json.JSONDecodeError as exc:
raise RecipeValidationError(f"Invalid gen_params payload: {exc}") from exc
if parsed is None:
return {}
if not isinstance(parsed, dict):
raise RecipeValidationError("gen_params payload must be an object")
return parsed
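`_parse_gen_params` distinguishes "not supplied" (`None`) from "supplied but empty" (`""` or JSON `null`, both normalized to `{}`) and rejects non-object payloads. A standalone sketch using a plain `ValueError` in place of `RecipeValidationError` (an assumption, since that class lives elsewhere in the codebase):

```python
import json
from typing import Any, Dict, Optional

def parse_gen_params(payload: Optional[str]) -> Optional[Dict[str, Any]]:
    """None means the parameter was absent; '' and JSON null normalize to {}."""
    if payload is None:
        return None
    if payload == "":
        return {}
    parsed = json.loads(payload)  # malformed JSON raises json.JSONDecodeError
    if parsed is None:
        return {}
    if not isinstance(parsed, dict):
        raise ValueError("gen_params payload must be an object")
    return parsed
```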
def _parse_resources_payload(self, payload_raw: str) -> tuple[Optional[Dict[str, Any]], List[Dict[str, Any]]]:
try:
payload = json.loads(payload_raw)
except json.JSONDecodeError as exc:
raise RecipeValidationError(f"Invalid resources payload: {exc}") from exc
if not isinstance(payload, list):
raise RecipeValidationError("Resources payload must be a list")
checkpoint_entry: Optional[Dict[str, Any]] = None
lora_entries: List[Dict[str, Any]] = []
for resource in payload:
if not isinstance(resource, dict):
continue
resource_type = str(resource.get("type") or "").lower()
if resource_type == "checkpoint":
checkpoint_entry = self._build_checkpoint_entry(resource)
elif resource_type in {"lora", "lycoris"}:
lora_entries.append(self._build_lora_entry(resource))
return checkpoint_entry, lora_entries
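The partitioning above keys off a lowercased `type` field, so `"LoRA"`, `"lora"`, and `"LyCORIS"` all land in the LoRA bucket while the last checkpoint entry wins. A simplified sketch that returns the raw resource dicts instead of building the full entries:

```python
import json
from typing import Any, Dict, List, Optional, Tuple

def split_resources(payload_raw: str) -> Tuple[Optional[Dict[str, Any]], List[Dict[str, Any]]]:
    """Partition a resources JSON array into one checkpoint plus LoRA entries."""
    payload = json.loads(payload_raw)
    if not isinstance(payload, list):
        raise ValueError("Resources payload must be a list")
    checkpoint: Optional[Dict[str, Any]] = None
    loras: List[Dict[str, Any]] = []
    for resource in payload:
        if not isinstance(resource, dict):
            continue
        resource_type = str(resource.get("type") or "").lower()  # case-insensitive match
        if resource_type == "checkpoint":
            checkpoint = resource
        elif resource_type in {"lora", "lycoris"}:
            loras.append(resource)
    return checkpoint, loras
```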
def _build_checkpoint_entry(self, resource: Dict[str, Any]) -> Dict[str, Any]:
return {
"type": resource.get("type", "checkpoint"),
"modelId": self._safe_int(resource.get("modelId")),
"modelVersionId": self._safe_int(resource.get("modelVersionId")),
"modelName": resource.get("modelName", ""),
"modelVersionName": resource.get("modelVersionName", ""),
}
def _build_lora_entry(self, resource: Dict[str, Any]) -> Dict[str, Any]:
weight_raw = resource.get("weight", 1.0)
try:
weight = float(weight_raw)
except (TypeError, ValueError):
weight = 1.0
return {
"file_name": resource.get("modelName", ""),
"weight": weight,
"id": self._safe_int(resource.get("modelVersionId")),
"name": resource.get("modelName", ""),
"version": resource.get("modelVersionName", ""),
"isDeleted": False,
"exclude": False,
}
async def _download_image_bytes(self, image_url: str) -> bytes:
civitai_client = self._civitai_client_getter()
downloader = await self._downloader_factory()
temp_path = None
try:
with tempfile.NamedTemporaryFile(delete=False) as temp_file:
temp_path = temp_file.name
download_url = image_url
civitai_match = re.match(r"https://civitai\.com/images/(\d+)", image_url)
if civitai_match:
if civitai_client is None:
raise RecipeDownloadError("Civitai client unavailable for image download")
image_info = await civitai_client.get_image_info(civitai_match.group(1))
if not image_info:
raise RecipeDownloadError("Failed to fetch image information from Civitai")
download_url = image_info.get("url")
if not download_url:
raise RecipeDownloadError("No image URL found in Civitai response")
success, result = await downloader.download_file(download_url, temp_path, use_auth=False)
if not success:
raise RecipeDownloadError(f"Failed to download image: {result}")
with open(temp_path, "rb") as file_obj:
return file_obj.read()
except RecipeDownloadError:
raise
except RecipeValidationError:
raise
except Exception as exc: # pragma: no cover - defensive guard
raise RecipeValidationError(f"Unable to download image: {exc}") from exc
finally:
if temp_path:
try:
os.unlink(temp_path)
except FileNotFoundError:
pass
def _safe_int(self, value: Any) -> int:
try:
return int(value)
except (TypeError, ValueError):
return 0
async def _resolve_base_model_from_checkpoint(self, checkpoint_entry: Dict[str, Any]) -> str:
version_id = self._safe_int(checkpoint_entry.get("modelVersionId"))
if not version_id:
return ""
try:
provider = await get_default_metadata_provider()
if not provider:
return ""
version_info = await provider.get_model_version_info(version_id)
if isinstance(version_info, tuple):
version_info = version_info[0]
if isinstance(version_info, dict):
base_model = version_info.get("baseModel") or ""
return str(base_model) if base_model is not None else ""
except Exception as exc: # pragma: no cover - defensive logging
self._logger.warning("Failed to resolve base model from checkpoint metadata: %s", exc)
return ""
class RecipeAnalysisHandler:
"""Analyze images to extract recipe metadata."""

View File

@@ -39,6 +39,7 @@ COMMON_ROUTE_DEFINITIONS: tuple[RouteDefinition, ...] = (
RouteDefinition("GET", "/api/lm/{prefix}/auto-organize-progress", "get_auto_organize_progress"),
RouteDefinition("GET", "/api/lm/{prefix}/top-tags", "get_top_tags"),
RouteDefinition("GET", "/api/lm/{prefix}/base-models", "get_base_models"),
RouteDefinition("GET", "/api/lm/{prefix}/model-types", "get_model_types"),
RouteDefinition("GET", "/api/lm/{prefix}/scan", "scan_models"),
RouteDefinition("GET", "/api/lm/{prefix}/roots", "get_model_roots"),
RouteDefinition("GET", "/api/lm/{prefix}/folders", "get_folders"),
@@ -56,6 +57,7 @@ COMMON_ROUTE_DEFINITIONS: tuple[RouteDefinition, ...] = (
RouteDefinition("GET", "/api/lm/{prefix}/civitai/model/version/{modelVersionId}", "get_civitai_model_by_version"),
RouteDefinition("GET", "/api/lm/{prefix}/civitai/model/hash/{hash}", "get_civitai_model_by_hash"),
RouteDefinition("POST", "/api/lm/{prefix}/updates/refresh", "refresh_model_updates"),
RouteDefinition("POST", "/api/lm/{prefix}/updates/fetch-missing-license", "fetch_missing_civitai_license_data"),
RouteDefinition("POST", "/api/lm/{prefix}/updates/ignore", "set_model_update_ignore"),
RouteDefinition("POST", "/api/lm/{prefix}/updates/ignore-version", "set_version_update_ignore"),
RouteDefinition("GET", "/api/lm/{prefix}/updates/status/{model_id}", "get_model_update_status"),
@@ -103,4 +105,3 @@ class ModelRouteRegistrar:
add_method_name = self._METHOD_MAP[method.upper()]
add_method = getattr(self._app.router, add_method_name)
add_method(path, handler)

View File

@@ -20,6 +20,7 @@ ROUTE_DEFINITIONS: tuple[RouteDefinition, ...] = (
RouteDefinition("GET", "/loras/recipes", "render_page"),
RouteDefinition("GET", "/api/lm/recipes", "list_recipes"),
RouteDefinition("GET", "/api/lm/recipe/{recipe_id}", "get_recipe"),
RouteDefinition("GET", "/api/lm/recipes/import-remote", "import_remote_recipe"),
RouteDefinition("POST", "/api/lm/recipes/analyze-image", "analyze_uploaded_image"),
RouteDefinition("POST", "/api/lm/recipes/analyze-local-image", "analyze_local_image"),
RouteDefinition("POST", "/api/lm/recipes/save", "save_recipe"),
@@ -61,4 +62,3 @@ class RecipeRouteRegistrar:
add_method_name = self._METHOD_MAP[method.upper()]
add_method = getattr(self._app.router, add_method_name)
add_method(path, handler)

View File

@@ -344,6 +344,11 @@ class UpdateRoutes:
origin.fetch()
if nightly:
# Reset to discard any local changes
repo.git.reset('--hard')
# Clean untracked files
repo.git.clean('-fd')
# Switch to main branch and pull latest
main_branch = 'main'
if main_branch not in [branch.name for branch in repo.branches]:
@@ -357,6 +362,11 @@ class UpdateRoutes:
new_version = f"main-{repo.head.commit.hexsha[:7]}"
else:
# Reset to discard any local changes
repo.git.reset('--hard')
# Clean untracked files
repo.git.clean('-fd')
# Get latest release tag
tags = sorted(repo.tags, key=lambda t: t.commit.committed_datetime, reverse=True)
if not tags:

View File

@@ -1,12 +1,21 @@
from abc import ABC, abstractmethod
import asyncio
from typing import Dict, List, Optional, Type, TYPE_CHECKING
from typing import Any, Dict, List, Optional, Type, TYPE_CHECKING
import logging
import os
from ..utils.constants import VALID_LORA_TYPES
from ..utils.models import BaseModelMetadata
from ..utils.metadata_manager import MetadataManager
from .model_query import FilterCriteria, ModelCacheRepository, ModelFilterSet, SearchStrategy, SettingsProvider
from .model_query import (
FilterCriteria,
ModelCacheRepository,
ModelFilterSet,
SearchStrategy,
SettingsProvider,
normalize_civitai_model_type,
resolve_civitai_model_type,
)
from .settings_manager import get_settings_manager
logger = logging.getLogger(__name__)
@@ -59,11 +68,14 @@ class BaseModelService(ABC):
search: str = None,
fuzzy_search: bool = False,
base_models: list = None,
tags: list = None,
model_types: list = None,
tags: Optional[Dict[str, str]] = None,
search_options: dict = None,
hash_filters: dict = None,
favorites_only: bool = False,
update_available_only: bool = False,
credit_required: Optional[bool] = None,
allow_selling_generated_content: Optional[bool] = None,
**kwargs,
) -> Dict:
"""Get paginated and filtered model data"""
@@ -78,6 +90,7 @@ class BaseModelService(ABC):
sorted_data,
folder=folder,
base_models=base_models,
model_types=model_types,
tags=tags,
favorites_only=favorites_only,
search_options=search_options,
@@ -93,6 +106,13 @@ class BaseModelService(ABC):
filtered_data = await self._apply_specific_filters(filtered_data, **kwargs)
# Apply license-based filters
if credit_required is not None:
filtered_data = await self._apply_credit_required_filter(filtered_data, credit_required)
if allow_selling_generated_content is not None:
filtered_data = await self._apply_allow_selling_filter(filtered_data, allow_selling_generated_content)
annotated_for_filter: Optional[List[Dict]] = None
if update_available_only:
annotated_for_filter = await self._annotate_update_flags(filtered_data)
@@ -140,7 +160,8 @@ class BaseModelService(ABC):
data: List[Dict],
folder: str = None,
base_models: list = None,
tags: list = None,
model_types: list = None,
tags: Optional[Dict[str, str]] = None,
favorites_only: bool = False,
search_options: dict = None,
) -> List[Dict]:
@@ -149,6 +170,7 @@ class BaseModelService(ABC):
criteria = FilterCriteria(
folder=folder,
base_models=base_models,
model_types=model_types,
tags=tags,
favorites_only=favorites_only,
search_options=normalized_options,
@@ -170,6 +192,61 @@ class BaseModelService(ABC):
"""Apply model-specific filters - to be overridden by subclasses if needed"""
return data
async def _apply_credit_required_filter(self, data: List[Dict], credit_required: bool) -> List[Dict]:
"""Filter by whether crediting the creator is required, based on license_flags.
Args:
data: List of model data items
credit_required:
- True: Return items where credit is required (allowNoCredit=False)
- False: Return items where credit is not required (allowNoCredit=True)
"""
filtered_data = []
for item in data:
license_flags = item.get("license_flags", 127) # Default to all permissions enabled
# Bit 0 represents allowNoCredit (1 = no credit required, 0 = credit required)
allow_no_credit = bool(license_flags & (1 << 0))
# If credit_required is True, we want items where allowNoCredit is False (credit required)
# If credit_required is False, we want items where allowNoCredit is True (no credit required)
if credit_required:
if not allow_no_credit: # Credit is required
filtered_data.append(item)
else:
if allow_no_credit: # Credit is not required
filtered_data.append(item)
return filtered_data
async def _apply_allow_selling_filter(self, data: List[Dict], allow_selling: bool) -> List[Dict]:
"""Filter by whether selling generated content is allowed, based on license_flags.
Args:
data: List of model data items
allow_selling:
- True: Return items where selling generated content is allowed (allowCommercialUse contains Image)
- False: Return items where selling generated content is not allowed (allowCommercialUse does not contain Image)
"""
filtered_data = []
for item in data:
license_flags = item.get("license_flags", 127) # Default to all permissions enabled
# Bits 1-4 represent commercial use permissions
# Bit 1 specifically represents Image permission (allowCommercialUse contains Image)
has_image_permission = bool(license_flags & (1 << 1))
# If allow_selling is True, we want items where Image permission is granted
# If allow_selling is False, we want items where Image permission is not granted
if allow_selling:
if has_image_permission: # Selling generated content is allowed
filtered_data.append(item)
else:
if not has_image_permission: # Selling generated content is not allowed
filtered_data.append(item)
return filtered_data
async def _annotate_update_flags(
self,
items: List[Dict],
@@ -203,20 +280,49 @@ class BaseModelService(ABC):
if not ordered_ids:
return annotated
strategy_value = self.settings.get("update_flag_strategy")
if isinstance(strategy_value, str) and strategy_value.strip():
strategy = strategy_value.strip().lower()
else:
strategy = "same_base"
same_base_mode = strategy == "same_base"
records = None
resolved: Optional[Dict[int, bool]] = None
bulk_method = getattr(self.update_service, "has_updates_bulk", None)
if callable(bulk_method):
try:
resolved = await bulk_method(self.model_type, ordered_ids)
except Exception as exc:
logger.error(
"Failed to resolve update status in bulk for %s models (%s): %s",
self.model_type,
ordered_ids,
exc,
exc_info=True,
)
resolved = None
if same_base_mode:
record_method = getattr(self.update_service, "get_records_bulk", None)
if callable(record_method):
try:
records = await record_method(self.model_type, ordered_ids)
resolved = {
model_id: record.has_update()
for model_id, record in records.items()
}
except Exception as exc:
logger.error(
"Failed to resolve update records in bulk for %s models (%s): %s",
self.model_type,
ordered_ids,
exc,
exc_info=True,
)
records = None
resolved = None
if resolved is None:
bulk_method = getattr(self.update_service, "has_updates_bulk", None)
if callable(bulk_method):
try:
resolved = await bulk_method(self.model_type, ordered_ids)
except Exception as exc:
logger.error(
"Failed to resolve update status in bulk for %s models (%s): %s",
self.model_type,
ordered_ids,
exc,
exc_info=True,
)
resolved = None
if resolved is None:
tasks = [
@@ -237,8 +343,24 @@ class BaseModelService(ABC):
resolved[model_id] = bool(result)
for model_id, items_for_id in id_to_items.items():
flag = bool(resolved.get(model_id, False))
default_flag = bool(resolved.get(model_id, False)) if resolved else False
record = records.get(model_id) if records else None
base_highest_versions = (
self._build_highest_local_versions_by_base(record) if same_base_mode and record else {}
)
for item in items_for_id:
if same_base_mode and record is not None:
base_model = self._extract_base_model(item)
normalized_base = self._normalize_base_model_name(base_model)
threshold_version = base_highest_versions.get(normalized_base) if normalized_base else None
if threshold_version is None:
threshold_version = self._extract_version_id(item)
flag = record.has_update_for_base(
threshold_version,
base_model,
)
else:
flag = default_flag
item['update_available'] = flag
return annotated
@@ -255,7 +377,71 @@ class BaseModelService(ABC):
return int(value)
except (TypeError, ValueError):
return None
@staticmethod
def _extract_version_id(item: Dict) -> Optional[int]:
civitai = item.get('civitai') if isinstance(item, dict) else None
if not isinstance(civitai, dict):
return None
value = civitai.get('id')
if value is None:
return None
try:
return int(value)
except (TypeError, ValueError):
return None
@staticmethod
def _extract_base_model(item: Dict) -> Optional[str]:
value = item.get('base_model')
if value is None:
return None
if isinstance(value, str):
candidate = value.strip()
else:
try:
candidate = str(value).strip()
except Exception:
return None
return candidate if candidate else None
@staticmethod
def _normalize_base_model_name(value: Optional[str]) -> Optional[str]:
"""Return a lowercased, trimmed base model name for comparison."""
if value is None:
return None
if isinstance(value, str):
candidate = value.strip()
else:
try:
candidate = str(value).strip()
except Exception:
return None
return candidate.lower() if candidate else None
def _build_highest_local_versions_by_base(self, record) -> Dict[str, int]:
"""Return the highest local version id known for each normalized base model."""
if record is None:
return {}
highest_by_base: Dict[str, int] = {}
for version in getattr(record, "versions", []):
if not getattr(version, "is_in_library", False):
continue
normalized_base = self._normalize_base_model_name(getattr(version, "base_model", None))
if normalized_base is None:
continue
version_id = getattr(version, "version_id", None)
if version_id is None:
continue
current_max = highest_by_base.get(normalized_base)
if current_max is None or version_id > current_max:
highest_by_base[normalized_base] = version_id
return highest_by_base
def _paginate(self, data: List[Dict], page: int, page_size: int) -> Dict:
"""Apply pagination to filtered data"""
total_items = len(data)
@@ -283,6 +469,25 @@ class BaseModelService(ABC):
async def get_base_models(self, limit: int = 20) -> List[Dict]:
"""Get base models sorted by frequency"""
return await self.scanner.get_base_models(limit)
async def get_model_types(self, limit: int = 20) -> List[Dict[str, Any]]:
"""Get counts of normalized CivitAI model types present in the cache."""
cache = await self.scanner.get_cached_data()
type_counts: Dict[str, int] = {}
for entry in cache.raw_data:
normalized_type = normalize_civitai_model_type(resolve_civitai_model_type(entry))
if not normalized_type or normalized_type not in VALID_LORA_TYPES:
continue
type_counts[normalized_type] = type_counts.get(normalized_type, 0) + 1
sorted_types = sorted(
[{"type": model_type, "count": count} for model_type, count in type_counts.items()],
key=lambda value: value["count"],
reverse=True,
)
return sorted_types[:limit]
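The counting-and-sorting step in `get_model_types` is a plain frequency tally; a self-contained sketch with hypothetical cache entries and a stand-in for the real normalizer and `VALID_LORA_TYPES` set:

```python
from collections import Counter

# Stand-ins for the real normalize/resolve helpers and the valid-type set.
VALID_LORA_TYPES = {"lora", "locon", "dora"}

def normalize_type(entry: dict) -> str:
    return str(entry.get("civitai", {}).get("model", {}).get("type", "")).lower()

def model_type_counts(raw_data, limit=20):
    counts = Counter(
        t for t in (normalize_type(e) for e in raw_data) if t in VALID_LORA_TYPES
    )
    # most_common already sorts by count descending, like the sorted() above.
    return [{"type": t, "count": c} for t, c in counts.most_common(limit)]

cache = [
    {"civitai": {"model": {"type": "LORA"}}},
    {"civitai": {"model": {"type": "LoCon"}}},
    {"civitai": {"model": {"type": "LORA"}}},
    {"civitai": {"model": {"type": "Checkpoint"}}},  # not a LoRA type, skipped
]
assert model_type_counts(cache) == [
    {"type": "lora", "count": 2},
    {"type": "locon", "count": 1},
]
```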
def has_hash(self, sha256: str) -> bool:
"""Check if a model with given hash exists"""
@@ -443,13 +648,55 @@ class BaseModelService(ABC):
return None
return metadata.modelDescription or ''
@staticmethod
def _parse_search_tokens(search_term: str) -> tuple[List[str], List[str]]:
"""Split a search string into include and exclude tokens."""
include_terms: List[str] = []
exclude_terms: List[str] = []
for raw_term in search_term.split():
term = raw_term.strip()
if not term:
continue
if term.startswith("-") and len(term) > 1:
exclude_terms.append(term[1:].lower())
else:
include_terms.append(term.lower())
return include_terms, exclude_terms
@staticmethod
def _relative_path_matches_tokens(
path_lower: str, include_terms: List[str], exclude_terms: List[str]
) -> bool:
"""Determine whether a relative path string satisfies include/exclude tokens."""
if any(term and term in path_lower for term in exclude_terms):
return False
for term in include_terms:
if term and term not in path_lower:
return False
return True
@staticmethod
def _relative_path_sort_key(relative_path: str, include_terms: List[str]) -> tuple:
"""Sort paths by how well they satisfy the include tokens."""
path_lower = relative_path.lower()
prefix_hits = sum(1 for term in include_terms if term and path_lower.startswith(term))
match_positions = [path_lower.find(term) for term in include_terms if term and term in path_lower]
first_match_index = min(match_positions) if match_positions else 0
return (-prefix_hits, first_match_index, len(relative_path), path_lower)
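Taken together, `_parse_search_tokens`, `_relative_path_matches_tokens`, and `_relative_path_sort_key` implement a small `-term` exclusion grammar with prefix-biased ranking. A condensed sketch of the same pipeline over hypothetical paths:

```python
from typing import List, Tuple

def parse_tokens(search: str) -> Tuple[List[str], List[str]]:
    """Split a query into lowercase include terms and '-' prefixed excludes."""
    include, exclude = [], []
    for term in search.split():
        if term.startswith("-") and len(term) > 1:
            exclude.append(term[1:].lower())
        elif term:
            include.append(term.lower())
    return include, exclude

def matches(path: str, include, exclude) -> bool:
    low = path.lower()
    return not any(t in low for t in exclude) and all(t in low for t in include)

def sort_key(path: str, include):
    low = path.lower()
    prefix_hits = sum(1 for t in include if low.startswith(t))
    positions = [low.find(t) for t in include if t in low]
    # More prefix hits first, then earliest match, then shorter, then A-Z.
    return (-prefix_hits, min(positions) if positions else 0, len(path), low)

paths = [
    "anime/style.safetensors",
    "style/anime-old.safetensors",
    "photo/real.safetensors",
]
inc, exc = parse_tokens("anime -old")
hits = sorted((p for p in paths if matches(p, inc, exc)), key=lambda p: sort_key(p, inc))
assert hits == ["anime/style.safetensors"]
```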
async def search_relative_paths(self, search_term: str, limit: int = 15) -> List[str]:
"""Search model relative file paths for autocomplete functionality"""
cache = await self.scanner.get_cached_data()
include_terms, exclude_terms = self._parse_search_tokens(search_term)
matching_paths = []
search_lower = search_term.lower()
# Get model roots for path calculation
model_roots = self.scanner.get_model_roots()
@@ -471,17 +718,19 @@ class BaseModelService(ABC):
relative_path = normalized_file[len(normalized_root):].lstrip(os.sep)
break
if relative_path and search_lower in relative_path.lower():
if not relative_path:
continue
relative_lower = relative_path.lower()
if self._relative_path_matches_tokens(relative_lower, include_terms, exclude_terms):
matching_paths.append(relative_path)
if len(matching_paths) >= limit * 2: # Get more for better sorting
break
# Sort by relevance (exact matches first, then by length)
matching_paths.sort(key=lambda x: (
not x.lower().startswith(search_lower), # Exact prefix matches first
len(x), # Then by length (shorter first)
x.lower() # Then alphabetically
))
# Sort by relevance (prefix and earliest hits first, then by length and alphabetically)
matching_paths.sort(
key=lambda relative: self._relative_path_sort_key(relative, include_terms)
)
return matching_paths[:limit]

View File

@@ -1,4 +1,3 @@
import os
import json
import logging
import asyncio
@@ -8,22 +7,6 @@ from .model_metadata_provider import CivArchiveModelMetadataProvider, ModelMetad
from .downloader import get_downloader
from .errors import RateLimitError
try:
from bs4 import BeautifulSoup
except ImportError as exc:
BeautifulSoup = None # type: ignore[assignment]
_BS4_IMPORT_ERROR = exc
else:
_BS4_IMPORT_ERROR = None
def _require_beautifulsoup():
if BeautifulSoup is None:
raise RuntimeError(
"BeautifulSoup (bs4) is required for CivArchive client. "
"Install it with 'pip install beautifulsoup4'."
) from _BS4_IMPORT_ERROR
return BeautifulSoup
logger = logging.getLogger(__name__)
class CivArchiveClient:
@@ -446,109 +429,3 @@ class CivArchiveClient:
if version is None:
return None, "Model not found"
return version, None
async def get_model_by_url(self, url) -> Optional[Dict]:
"""Fetch a specific model version by parsing the CivArchive HTML page (legacy method).
This is the original HTML-scraping implementation, kept for reference and for newly added sites not yet covered by the API.
The primary get_model_version() now uses the API instead.
"""
try:
# Construct CivArchive URL
url = f"https://civarchive.com/{url}"
downloader = await get_downloader()
session = await downloader.session
async with session.get(url) as response:
if response.status != 200:
return None
html_content = await response.text()
# Parse HTML to extract JSON data
soup_parser = _require_beautifulsoup()
soup = soup_parser(html_content, 'html.parser')
script_tag = soup.find('script', {'id': '__NEXT_DATA__', 'type': 'application/json'})
if not script_tag:
return None
# Parse JSON content
json_data = json.loads(script_tag.string)
model_data = json_data.get('props', {}).get('pageProps', {}).get('model')
if not model_data or 'version' not in model_data:
return None
# Extract version data as base
version = model_data['version'].copy()
# Restructure stats
if 'downloadCount' in version and 'ratingCount' in version and 'rating' in version:
version['stats'] = {
'downloadCount': version.pop('downloadCount'),
'ratingCount': version.pop('ratingCount'),
'rating': version.pop('rating')
}
# Rename trigger to trainedWords
if 'trigger' in version:
version['trainedWords'] = version.pop('trigger')
# Transform files data to expected format
if 'files' in version:
transformed_files = []
for file_data in version['files']:
# Find first available mirror (deletedAt is null)
available_mirror = None
for mirror in file_data.get('mirrors', []):
if mirror.get('deletedAt') is None:
available_mirror = mirror
break
# Create transformed file entry
transformed_file = {
'id': file_data.get('id'),
'sizeKB': file_data.get('sizeKB'),
'name': available_mirror.get('filename', file_data.get('name')) if available_mirror else file_data.get('name'),
'type': file_data.get('type'),
'downloadUrl': available_mirror.get('url') if available_mirror else None,
'primary': file_data.get('is_primary', False),
'mirrors': file_data.get('mirrors', [])
}
# Transform hash format
if 'sha256' in file_data:
transformed_file['hashes'] = {
'SHA256': file_data['sha256'].upper()
}
transformed_files.append(transformed_file)
version['files'] = transformed_files
# Add model information
version['model'] = {
'name': model_data.get('name'),
'type': model_data.get('type'),
'nsfw': model_data.get('is_nsfw', False),
'description': model_data.get('description'),
'tags': model_data.get('tags', [])
}
version['creator'] = {
'username': model_data.get('username'),
'image': ''
}
# Add source identifier
version['source'] = 'civarchive'
version['is_deleted'] = json_data.get('query', {}).get('is_deleted', False)
return version
except RateLimitError:
raise
except Exception as e:
logger.error(f"Error fetching CivArchive model version (scraping) {url}: {e}")
return None
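The file transformation in the legacy scraper above (pick the first non-deleted mirror, surface its URL and filename, and uppercase the SHA256 into a `hashes` map) can be isolated as a small pure function; the sample data below is hypothetical:

```python
def transform_file(file_data: dict) -> dict:
    """Reshape a CivArchive file record; first mirror without deletedAt wins."""
    mirror = next(
        (m for m in file_data.get("mirrors", []) if m.get("deletedAt") is None),
        None,
    )
    out = {
        "id": file_data.get("id"),
        "sizeKB": file_data.get("sizeKB"),
        "name": mirror.get("filename", file_data.get("name")) if mirror else file_data.get("name"),
        "type": file_data.get("type"),
        "downloadUrl": mirror.get("url") if mirror else None,
        "primary": file_data.get("is_primary", False),
        "mirrors": file_data.get("mirrors", []),
    }
    if "sha256" in file_data:
        out["hashes"] = {"SHA256": file_data["sha256"].upper()}
    return out

sample = {
    "id": 1, "name": "model.safetensors", "sha256": "abc123",
    "mirrors": [
        {"url": "https://x/a", "filename": "a.safetensors", "deletedAt": "2024-01-01"},
        {"url": "https://x/b", "filename": "b.safetensors", "deletedAt": None},
    ],
}
result = transform_file(sample)
assert result["downloadUrl"] == "https://x/b"
assert result["hashes"] == {"SHA256": "ABC123"}
```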

View File

@@ -6,6 +6,7 @@ from typing import Any, Optional, Dict, Tuple, List, Sequence
from .model_metadata_provider import CivitaiModelMetadataProvider, ModelMetadataProviderManager
from .downloader import get_downloader
from .errors import RateLimitError, ResourceNotFoundError
from ..utils.civitai_utils import resolve_license_payload
logger = logging.getLogger(__name__)
@@ -103,44 +104,32 @@ class CivitaiClient:
async def get_model_by_hash(self, model_hash: str) -> Tuple[Optional[Dict], Optional[str]]:
try:
success, result = await self._make_request(
success, version = await self._make_request(
'GET',
f"{self.base_url}/model-versions/by-hash/{model_hash}",
use_auth=True
)
if success:
# Get model ID from version data
model_id = result.get('modelId')
if model_id:
# Fetch additional model metadata
success_model, data = await self._make_request(
'GET',
f"{self.base_url}/models/{model_id}",
use_auth=True
)
if success_model:
# Enrich version_info with model data
result['model']['description'] = data.get("description")
result['model']['tags'] = data.get("tags", [])
if not success:
message = str(version)
if "not found" in message.lower():
return None, "Model not found"
# Add creator from model data
result['creator'] = data.get("creator")
logger.error("Failed to fetch model info for %s: %s", model_hash[:10], message)
return None, message
self._remove_comfy_metadata(result)
return result, None
# Handle specific error cases
if "not found" in str(result):
return None, "Model not found"
# Other error cases
logger.error(f"Failed to fetch model info for {model_hash[:10]}: {result}")
return None, str(result)
model_id = version.get('modelId')
if model_id:
model_data = await self._fetch_model_data(model_id)
if model_data:
self._enrich_version_with_model_data(version, model_data)
self._remove_comfy_metadata(version)
return version, None
except RateLimitError:
raise
except Exception as e:
logger.error(f"API Error: {str(e)}")
return None, str(e)
except Exception as exc:
logger.error("API Error: %s", exc)
return None, str(exc)
async def download_preview_image(self, image_url: str, save_path: str):
try:
@@ -257,6 +246,10 @@ class CivitaiClient:
'modelVersions': item.get('modelVersions', []),
'type': item.get('type', ''),
'name': item.get('name', ''),
'allowNoCredit': item.get('allowNoCredit'),
'allowCommercialUse': item.get('allowCommercialUse'),
'allowDerivatives': item.get('allowDerivatives'),
'allowDifferentLicense': item.get('allowDifferentLicense'),
}
return payload
except RateLimitError:
@@ -420,6 +413,10 @@ class CivitaiClient:
model_info['tags'] = model_data.get("tags", [])
version['creator'] = model_data.get("creator")
license_payload = resolve_license_payload(model_data)
for field, value in license_payload.items():
model_info[field] = value
async def get_model_version_info(self, version_id: str) -> Tuple[Optional[Dict], Optional[str]]:
"""Fetch model version metadata from Civitai

View File

@@ -1,7 +1,10 @@
import copy
import logging
import os
import asyncio
import inspect
import shutil
import zipfile
from collections import OrderedDict
import uuid
from typing import Dict, List, Optional, Tuple
@@ -12,6 +15,7 @@ from ..utils.civitai_utils import rewrite_preview_url
from ..utils.preview_selection import select_preview_media
from ..utils.utils import sanitize_folder_name
from ..utils.exif_utils import ExifUtils
from ..utils.file_utils import calculate_sha256
from ..utils.metadata_manager import MetadataManager
from .service_registry import ServiceRegistry
from .settings_manager import get_settings_manager
@@ -331,7 +335,7 @@ class DownloadManager:
await progress_callback(0)
# 2. Get file information
file_info = next((f for f in version_info.get('files', []) if f.get('primary') and f.get('type') == 'Model'), None)
file_info = next((f for f in version_info.get('files', []) if f.get('primary') and f.get('type') in ('Model', 'Negative')), None)
if not file_info:
return {'success': False, 'error': 'No primary file found in metadata'}
mirrors = file_info.get('mirrors') or []
@@ -556,6 +560,13 @@ class DownloadManager:
download_id: str = None,
) -> Dict:
"""Execute the actual download process including preview images and model files"""
metadata_entries: List = []
metadata_files_for_cleanup: List[str] = []
extracted_paths: List[str] = []
metadata_path = ""
preview_targets: List[str] = []
preview_path: str | None = None
preview_nsfw_level = 0
try:
# Extract original filename details
original_filename = os.path.basename(metadata.file_path)
@@ -699,10 +710,9 @@ class DownloadManager:
logger.warning(f"Failed to delete temp file: {e}")
if preview_downloaded and preview_path:
preview_nsfw_level = nsfw_level
metadata.preview_url = preview_path.replace(os.sep, '/')
metadata.preview_nsfw_level = nsfw_level
if download_id and download_id in self._active_downloads:
self._active_downloads[download_id]['preview_path'] = preview_path
if progress_callback:
await progress_callback(3) # 3% progress after preview download
@@ -761,77 +771,189 @@ class DownloadManager:
return {'success': False, 'error': last_error or 'Failed to download file'}
# 4. Update file information (size and modified time)
metadata.update_file_info(save_path)
# 4. Handle archive extraction and prepare per-file metadata
actual_file_paths = [save_path]
if zipfile.is_zipfile(save_path):
extracted_paths = await self._extract_safetensors_from_archive(save_path)
if not extracted_paths:
return {'success': False, 'error': 'Zip archive does not contain any safetensors files'}
actual_file_paths = extracted_paths
try:
os.remove(save_path)
except OSError as exc:
logger.warning(f"Unable to delete temporary archive {save_path}: {exc}")
if download_id and download_id in self._active_downloads:
self._active_downloads[download_id]['file_path'] = extracted_paths[0]
self._active_downloads[download_id]['extracted_paths'] = extracted_paths
metadata_entries = await self._build_metadata_entries(metadata, actual_file_paths)
if preview_path:
preview_targets = self._distribute_preview_to_entries(preview_path, metadata_entries)
for entry, target in zip(metadata_entries, preview_targets):
entry.preview_url = target.replace(os.sep, "/")
entry.preview_nsfw_level = preview_nsfw_level
if download_id and download_id in self._active_downloads and preview_targets:
self._active_downloads[download_id]["preview_path"] = preview_targets[0]
scanner = None
adjust_root: Optional[str] = None
# 5. Determine scanner and adjust metadata for cache consistency
if model_type == "checkpoint":
scanner = await self._get_checkpoint_scanner()
logger.info(f"Updating checkpoint cache for {save_path}")
logger.info(f"Updating checkpoint cache for {actual_file_paths[0]}")
elif model_type == "lora":
scanner = await self._get_lora_scanner()
logger.info(f"Updating lora cache for {save_path}")
logger.info(f"Updating lora cache for {actual_file_paths[0]}")
elif model_type == "embedding":
scanner = await ServiceRegistry.get_embedding_scanner()
logger.info(f"Updating embedding cache for {save_path}")
logger.info(f"Updating embedding cache for {actual_file_paths[0]}")
if scanner is not None:
file_path_for_adjust = getattr(metadata, "file_path", save_path)
if isinstance(file_path_for_adjust, str):
normalized_file_path = file_path_for_adjust.replace(os.sep, "/")
else:
normalized_file_path = str(file_path_for_adjust)
adjust_cached_entry = (
getattr(scanner, "adjust_cached_entry", None) if scanner is not None else None
)
find_root = getattr(scanner, "_find_root_for_file", None)
if callable(find_root):
try:
adjust_root = find_root(normalized_file_path)
except TypeError:
adjust_root = None
for index, entry in enumerate(metadata_entries):
file_path_for_adjust = getattr(entry, "file_path", actual_file_paths[index])
normalized_file_path = (
file_path_for_adjust.replace(os.sep, "/")
if isinstance(file_path_for_adjust, str)
else str(file_path_for_adjust)
)
adjust_metadata = getattr(scanner, "adjust_metadata", None)
if callable(adjust_metadata):
metadata = adjust_metadata(metadata, normalized_file_path, adjust_root)
if scanner is not None:
find_root = getattr(scanner, "_find_root_for_file", None)
adjust_root = None
if callable(find_root):
try:
adjust_root = find_root(normalized_file_path)
except TypeError:
adjust_root = None
# 6. Persist metadata with any adjustments
await MetadataManager.save_metadata(save_path, metadata)
adjust_metadata = getattr(scanner, "adjust_metadata", None)
if callable(adjust_metadata):
adjusted_entry = adjust_metadata(entry, normalized_file_path, adjust_root)
if adjusted_entry is not None:
entry = adjusted_entry
metadata_entries[index] = entry
# Convert metadata to dictionary
metadata_dict = metadata.to_dict()
adjust_cached_entry = getattr(scanner, "adjust_cached_entry", None) if scanner is not None else None
if callable(adjust_cached_entry):
metadata_dict = adjust_cached_entry(metadata_dict)
metadata_file_path = os.path.splitext(entry.file_path)[0] + '.metadata.json'
metadata_files_for_cleanup.append(metadata_file_path)
# Add model to cache and save to disk in a single operation
await scanner.add_model_to_cache(metadata_dict, relative_path)
await MetadataManager.save_metadata(entry.file_path, entry)
metadata_dict = entry.to_dict()
if callable(adjust_cached_entry):
metadata_dict = adjust_cached_entry(metadata_dict)
if scanner is not None:
await scanner.add_model_to_cache(metadata_dict, relative_path)
# Report 100% completion
if progress_callback:
await progress_callback(100)
return {
'success': True
}
return {'success': True}
except Exception as e:
logger.error(f"Error in _execute_download: {e}", exc_info=True)
# Clean up partial downloads except .part file
cleanup_files = [metadata_path]
if hasattr(metadata, 'preview_url') and metadata.preview_url and os.path.exists(metadata.preview_url):
cleanup_files.append(metadata.preview_url)
for path in cleanup_files:
cleanup_targets = {
path
for path in [save_path, metadata_path, *metadata_files_for_cleanup, *extracted_paths]
if path
}
preview_candidate = (
metadata_entries[0].preview_url
if metadata_entries
else getattr(metadata, "preview_url", None)
)
if preview_candidate:
cleanup_targets.add(preview_candidate)
cleanup_targets.update(preview_targets)
for path in cleanup_targets:
if path and os.path.exists(path):
try:
os.remove(path)
except Exception as e:
logger.warning(f"Failed to cleanup file {path}: {e}")
except Exception as exc:
logger.warning(f"Failed to cleanup file {path}: {exc}")
return {'success': False, 'error': str(e)}
async def _extract_safetensors_from_archive(self, archive_path: str) -> List[str]:
if not zipfile.is_zipfile(archive_path):
return []
target_dir = os.path.dirname(archive_path)
def _extract_sync() -> List[str]:
extracted_files: List[str] = []
with zipfile.ZipFile(archive_path, "r") as archive:
for info in archive.infolist():
if info.is_dir():
continue
if not info.filename.lower().endswith(".safetensors"):
continue
file_name = os.path.basename(info.filename)
if not file_name:
continue
dest_path = self._resolve_extracted_destination(target_dir, file_name)
with archive.open(info) as source, open(dest_path, "wb") as target:
shutil.copyfileobj(source, target)
extracted_files.append(dest_path)
return extracted_files
return await asyncio.to_thread(_extract_sync)
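`_extract_safetensors_from_archive` offloads a synchronous zip walk to a thread; the core filtering logic, stripped of the async wrapper and collision handling, looks roughly like this (paths are illustrative):

```python
import os
import shutil
import zipfile

def extract_safetensors(archive_path: str, target_dir: str) -> list:
    """Extract only *.safetensors members, flattened into target_dir."""
    extracted = []
    with zipfile.ZipFile(archive_path, "r") as archive:
        for info in archive.infolist():
            # Skip directories and anything that is not a safetensors file.
            if info.is_dir() or not info.filename.lower().endswith(".safetensors"):
                continue
            file_name = os.path.basename(info.filename)
            if not file_name:
                continue
            dest = os.path.join(target_dir, file_name)
            with archive.open(info) as src, open(dest, "wb") as dst:
                shutil.copyfileobj(src, dst)
            extracted.append(dest)
    return extracted
```

Flattening via `os.path.basename` also neutralizes path-traversal member names, since the destination is always joined directly under `target_dir`.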
async def _build_metadata_entries(self, base_metadata, file_paths: List[str]) -> List:
if not file_paths:
return []
entries: List = []
for index, file_path in enumerate(file_paths):
entry = base_metadata if index == 0 else copy.deepcopy(base_metadata)
entry.update_file_info(file_path)
entry.sha256 = await calculate_sha256(file_path)
entries.append(entry)
return entries
def _resolve_extracted_destination(self, target_dir: str, filename: str) -> str:
base_name, extension = os.path.splitext(filename)
candidate = filename
destination = os.path.join(target_dir, candidate)
counter = 1
while os.path.exists(destination):
candidate = f"{base_name}-{counter}{extension}"
destination = os.path.join(target_dir, candidate)
counter += 1
return destination
def _distribute_preview_to_entries(self, preview_path: str, entries: List) -> List[str]:
if not preview_path or not entries:
return []
if not os.path.exists(preview_path):
return []
extension = os.path.splitext(preview_path)[1] or ".webp"
targets = [
os.path.splitext(entry.file_path)[0] + extension for entry in entries
]
if not targets:
return []
first_target = targets[0]
if preview_path != first_target:
os.replace(preview_path, first_target)
source_path = first_target
for target in targets[1:]:
shutil.copyfile(source_path, target)
return targets
async def _handle_download_progress(
self,
progress_update,
@@ -895,16 +1017,23 @@ class DownloadManager:
# Clean up ALL files including .part when user cancels
download_info = self._active_downloads.get(download_id)
if download_info:
# Collect the main file and any extracted files for deletion
target_files = set()
primary_path = download_info.get('file_path')
if primary_path:
target_files.add(primary_path)
for extra_path in download_info.get('extracted_paths', []):
if extra_path:
target_files.add(extra_path)
for file_path in target_files:
if os.path.exists(file_path):
try:
os.unlink(file_path)
logger.debug(f"Deleted cancelled download: {file_path}")
except Exception as e:
logger.error(f"Error deleting file: {e}")
# Delete the .part file (only on user cancellation)
if 'part_path' in download_info:
part_path = download_info['part_path']
@@ -914,10 +1043,9 @@ class DownloadManager:
logger.debug(f"Deleted partial download: {part_path}")
except Exception as e:
logger.error(f"Error deleting part file: {e}")
# Delete metadata files for each resolved path
for file_path in target_files:
metadata_path = os.path.splitext(file_path)[0] + '.metadata.json'
if os.path.exists(metadata_path):
try:
@@ -925,15 +1053,16 @@ class DownloadManager:
except Exception as e:
logger.error(f"Error deleting metadata file: {e}")
preview_path_value = download_info.get('preview_path')
if preview_path_value and os.path.exists(preview_path_value):
try:
os.unlink(preview_path_value)
logger.debug(f"Deleted preview file: {preview_path_value}")
except Exception as e:
logger.error(f"Error deleting preview file {preview_path_value}: {e}")
# Delete preview file if exists (.webp or .mp4) for legacy paths
for file_path in target_files:
for preview_ext in ['.webp', '.mp4']:
preview_path = os.path.splitext(file_path)[0] + preview_ext
if os.path.exists(preview_path):
@@ -941,8 +1070,7 @@ class DownloadManager:
os.unlink(preview_path)
logger.debug(f"Deleted preview file: {preview_path}")
except Exception as e:
logger.error(f"Error deleting preview file {preview_path}: {e}")
return {'success': True, 'message': 'Download cancelled successfully'}
except Exception as e:
logger.error(f"Error cancelling download: {e}", exc_info=True)
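The cancellation path above gathers every candidate path into a set before deleting, so a path shared between the main file, extracted files, and metadata is only unlinked once. A minimal sketch of that dedup-then-delete pattern (the helper name is illustrative):

```python
import os
import tempfile


def cleanup_paths(*paths) -> list:
    """Delete each existing path exactly once; ignore None and duplicates."""
    removed = []
    for path in {p for p in paths if p}:  # set() drops None and duplicates
        if os.path.exists(path):
            try:
                os.remove(path)
                removed.append(path)
            except OSError:
                pass  # best-effort cleanup; real code logs the failure
    return removed
```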


@@ -2,11 +2,12 @@ import os
import logging
from .model_metadata_provider import (
ModelMetadataProvider,
ModelMetadataProviderManager,
SQLiteModelMetadataProvider,
CivitaiModelMetadataProvider,
CivArchiveModelMetadataProvider,
FallbackMetadataProvider,
RateLimitRetryingProvider,
)
from .settings_manager import get_settings_manager
from .metadata_archive_manager import MetadataArchiveManager
@@ -108,14 +109,24 @@ async def get_metadata_archive_manager():
base_path = os.path.dirname(os.path.dirname(os.path.dirname(__file__)))
return MetadataArchiveManager(base_path)
def _wrap_provider_with_rate_limit(provider_name: str | None, provider: ModelMetadataProvider) -> ModelMetadataProvider:
if isinstance(provider, (FallbackMetadataProvider, RateLimitRetryingProvider)):
return provider
return RateLimitRetryingProvider(provider, label=provider_name)
async def get_metadata_provider(provider_name: str = None):
"""Get a specific metadata provider or default provider with rate-limit handling."""
provider_manager = await ModelMetadataProviderManager.get_instance()
provider = (
provider_manager._get_provider(provider_name)
if provider_name
else provider_manager._get_provider()
)
return _wrap_provider_with_rate_limit(provider_name, provider)
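`_wrap_provider_with_rate_limit` is idempotent: the `isinstance` guard prevents double-wrapping, and the adapter forwards unknown attributes to the wrapped provider via `__getattr__`. A simplified sketch of that pattern with stand-in classes:

```python
class Provider:
    def get(self, key: str) -> str:
        return f"value:{key}"


class RetryingProvider:
    """Adapter that would add retry behaviour around any provider."""

    def __init__(self, provider):
        self._provider = provider

    def __getattr__(self, item):
        # Delegate everything not defined on the adapter to the wrapped provider.
        return getattr(self._provider, item)


def wrap_once(provider):
    # Idempotent: wrapping an already-wrapped provider is a no-op.
    if isinstance(provider, RetryingProvider):
        return provider
    return RetryingProvider(provider)
```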
async def get_default_metadata_provider():
"""Get the default metadata provider (fallback or single provider)"""


@@ -9,6 +9,7 @@ from datetime import datetime
from typing import Any, Awaitable, Callable, Dict, Iterable, Optional
from ..services.settings_manager import SettingsManager
from ..utils.civitai_utils import resolve_license_payload
from ..utils.model_utils import determine_base_model
from .errors import RateLimitError
@@ -135,6 +136,17 @@ class MetadataSyncService:
):
local_metadata.setdefault("civitai", {})["creator"] = model_data["creator"]
merged_civitai = local_metadata.get("civitai") or {}
civitai_model = merged_civitai.get("model")
if not isinstance(civitai_model, dict):
civitai_model = {}
license_payload = resolve_license_payload(model_data)
civitai_model.update(license_payload)
merged_civitai["model"] = civitai_model
local_metadata["civitai"] = merged_civitai
local_metadata["base_model"] = determine_base_model(
civitai_metadata.get("baseModel")
)
@@ -202,6 +214,7 @@ class MetadataSyncService:
metadata_provider: Optional[MetadataProviderProtocol] = None
provider_used: Optional[str] = None
last_error: Optional[str] = None
civitai_api_not_found = False
for provider_name, provider in provider_attempts:
try:
@@ -216,19 +229,24 @@ class MetadataSyncService:
if provider_name == "sqlite":
sqlite_attempted = True
is_default_provider = provider_name is None
if civitai_metadata_candidate:
civitai_metadata = civitai_metadata_candidate
metadata_provider = provider
provider_used = provider_name
break
if is_default_provider and error == "Model not found":
civitai_api_not_found = True
last_error = error or last_error
if civitai_metadata is None or metadata_provider is None:
if sqlite_attempted:
model_data["db_checked"] = True
if last_error == "Model not found":
if civitai_api_not_found:
model_data["from_civitai"] = False
model_data["civitai_deleted"] = True
model_data["db_checked"] = sqlite_attempted or (enable_archive and model_data.get("db_checked", False))
@@ -254,7 +272,10 @@ class MetadataSyncService:
return False, error_msg
model_data["from_civitai"] = True
model_data["civitai_deleted"] = civitai_metadata.get("source") == "archive_db" or civitai_metadata.get("source") == "civarchive"
if provider_used is None:
model_data["civitai_deleted"] = False
elif civitai_api_not_found:
model_data["civitai_deleted"] = True
model_data["db_checked"] = enable_archive and (
civitai_metadata.get("source") == "archive_db" or sqlite_attempted
)
@@ -295,6 +316,7 @@ class MetadataSyncService:
"preview_url": local_metadata.get("preview_url"),
"civitai": local_metadata.get("civitai"),
}
model_data.update(update_payload)
await update_cache_func(file_path, file_path, local_metadata)
@@ -436,4 +458,3 @@ class MetadataSyncService:
results["verified_as_duplicates"] = False
return results


@@ -1,7 +1,8 @@
import asyncio
import fnmatch
import os
import logging
from typing import Any, Dict, List, Optional, Sequence, Set
from abc import ABC, abstractmethod
from ..utils.utils import calculate_relative_path_for_model, remove_empty_dirs
@@ -79,9 +80,10 @@ class ModelFileService:
return self.scanner.get_model_roots()
async def auto_organize_models(
self,
file_paths: Optional[List[str]] = None,
progress_callback: Optional[ProgressCallback] = None,
exclusion_patterns: Optional[Sequence[str]] = None,
) -> AutoOrganizeResult:
"""Auto-organize models based on current settings
@@ -100,6 +102,13 @@ class ModelFileService:
# Get all models from cache
cache = await self.scanner.get_cached_data()
all_models = cache.raw_data
settings_manager = get_settings_manager()
normalized_exclusions = settings_manager.normalize_auto_organize_exclusions(
exclusion_patterns
if exclusion_patterns is not None
else settings_manager.get_auto_organize_exclusions()
)
# Filter models if specific file paths are provided
if file_paths:
@@ -107,11 +116,19 @@ class ModelFileService:
result.operation_type = 'bulk'
else:
result.operation_type = 'all'
# Get model roots for this scanner
model_roots = self.get_model_roots()
if not model_roots:
raise ValueError('No model roots configured')
if normalized_exclusions:
all_models = [
model
for model in all_models
if not self._should_exclude_model(
model.get('file_path'), normalized_exclusions, model_roots
)
]
# Check if flat structure is configured for this model type
@@ -133,7 +150,34 @@ class ModelFileService:
'skipped': 0,
'operation_type': result.operation_type
})
if result.total == 0:
if progress_callback:
await asyncio.sleep(0.1)
payload = {
'type': 'auto_organize_progress',
'total': 0,
'processed': 0,
'success': 0,
'failures': 0,
'skipped': 0,
'operation_type': result.operation_type
}
await progress_callback.on_progress({**payload, 'status': 'processing'})
await progress_callback.on_progress({
**payload,
'status': 'cleaning',
'message': 'Cleaning up empty directories...'
})
result.cleanup_counts = {}
await progress_callback.on_progress({
**payload,
'status': 'completed',
'cleanup': result.cleanup_counts
})
return result
# Process models in batches
await self._process_models_in_batches(
all_models,
@@ -301,10 +345,43 @@ class ModelFileService:
# Normalize paths for comparison
normalized_root = os.path.normpath(root).replace(os.sep, '/')
normalized_file = os.path.normpath(file_path).replace(os.sep, '/')
if normalized_file.startswith(normalized_root):
return root
return None
def _should_exclude_model(
self,
file_path: Optional[str],
patterns: Sequence[str],
model_roots: Sequence[str],
) -> bool:
if not file_path or not patterns:
return False
normalized_path = os.path.normpath(file_path).replace(os.sep, '/')
filename = os.path.basename(normalized_path)
relative_path = None
if model_roots:
root = self._find_model_root(file_path, list(model_roots))
if root:
normalized_root = os.path.normpath(root)
try:
relative = os.path.relpath(file_path, normalized_root)
except ValueError:
relative = None
if relative is not None:
relative_path = relative.replace(os.sep, '/')
for pattern in patterns:
if fnmatch.fnmatch(filename, pattern):
return True
if relative_path and fnmatch.fnmatch(relative_path, pattern):
return True
if fnmatch.fnmatch(normalized_path, pattern):
return True
return False
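`_should_exclude_model` tests each glob against three forms of the path — the bare filename, the path relative to the model root, and the full normalized path — so patterns like `nsfw/*` and `*.ckpt` both work. A condensed sketch with a single root:

```python
import fnmatch
import os


def should_exclude(file_path: str, patterns: list, root: str) -> bool:
    """True if any glob matches the filename, root-relative path, or full path."""
    normalized = os.path.normpath(file_path).replace(os.sep, "/")
    filename = os.path.basename(normalized)
    relative = os.path.relpath(file_path, root).replace(os.sep, "/")
    for pattern in patterns:
        if (
            fnmatch.fnmatch(filename, pattern)
            or fnmatch.fnmatch(relative, pattern)
            or fnmatch.fnmatch(normalized, pattern)
        ):
            return True
    return False
```

Note that `fnmatch`'s `*` also matches `/`, so `nsfw/*` matches arbitrarily deep subtrees of `nsfw/`.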
async def _calculate_target_directory(
self,
@@ -461,4 +538,4 @@ class ModelMoveService:
'results': [],
'success_count': 0,
'failure_count': len(file_paths)
}


@@ -4,26 +4,29 @@ from __future__ import annotations
import logging
import os
from typing import Any, Awaitable, Callable, Dict, Iterable, List, Mapping, Optional, TYPE_CHECKING
from ..services.service_registry import ServiceRegistry
from ..utils.constants import PREVIEW_EXTENSIONS
logger = logging.getLogger(__name__)
if TYPE_CHECKING:
from ..services.model_update_service import ModelUpdateService
async def delete_model_artifacts(
target_dir: str, file_name: str, main_extension: str | None = None
) -> List[str]:
"""Delete the primary model artefacts within ``target_dir``."""
main_extension = ".safetensors" if main_extension is None else main_extension
main_file = f"{file_name}{main_extension}" if main_extension else file_name
patterns = [main_file, f"{file_name}.metadata.json"]
for ext in PREVIEW_EXTENSIONS:
patterns.append(f"{file_name}{ext}")
deleted: List[str] = []
main_path = os.path.join(target_dir, main_file).replace(os.sep, "/")
if os.path.exists(main_path):
@@ -54,6 +57,7 @@ class ModelLifecycleService:
metadata_manager,
metadata_loader: Callable[[str], Awaitable[Dict[str, object]]],
recipe_scanner_factory: Callable[[], Awaitable] | None = None,
update_service: "ModelUpdateService" | None = None,
) -> None:
self._scanner = scanner
self._metadata_manager = metadata_manager
@@ -61,6 +65,7 @@ class ModelLifecycleService:
self._recipe_scanner_factory = (
recipe_scanner_factory or ServiceRegistry.get_recipe_scanner
)
self._update_service = update_service
async def delete_model(self, file_path: str) -> Dict[str, object]:
"""Delete a model file and associated artefacts."""
@@ -68,20 +73,103 @@ class ModelLifecycleService:
if not file_path:
raise ValueError("Model path is required")
cache = await self._scanner.get_cached_data()
cached_entry = None
if cache and hasattr(cache, "raw_data"):
cached_entry = next(
(item for item in cache.raw_data if item.get("file_path") == file_path),
None,
)
metadata_payload = {}
try:
metadata_payload = await self._metadata_manager.load_metadata_payload(file_path)
except Exception as exc: # pragma: no cover - defensive guard
logger.debug("Failed to load metadata payload for %s: %s", file_path, exc)
model_id = (
self._extract_model_id_from_payload(metadata_payload)
or self._extract_model_id_from_payload(cached_entry)
)
target_dir = os.path.dirname(file_path)
base_name = os.path.basename(file_path)
file_name, main_extension = os.path.splitext(base_name)
deleted_files = await delete_model_artifacts(
target_dir, file_name, main_extension=main_extension
)
if cache:
cache.raw_data = [
item for item in cache.raw_data if item.get("file_path") != file_path
]
await cache.resort()
if hasattr(self._scanner, "_hash_index") and self._scanner._hash_index:
self._scanner._hash_index.remove_by_path(file_path)
await self._sync_update_for_model(model_id)
return {"success": True, "deleted_files": deleted_files}
@staticmethod
def _extract_model_id_from_payload(payload: Any) -> Optional[int]:
if not isinstance(payload, Mapping):
return None
civitai = payload.get("civitai")
if isinstance(civitai, Mapping):
candidate = civitai.get("modelId") or civitai.get("model_id")
if candidate is None:
model_section = civitai.get("model")
if isinstance(model_section, Mapping):
candidate = model_section.get("id")
normalized = ModelLifecycleService._coerce_int(candidate)
if normalized is not None:
return normalized
fallback = payload.get("model_id") or payload.get("civitai_model_id")
return ModelLifecycleService._coerce_int(fallback)
@staticmethod
def _coerce_int(value: Any) -> Optional[int]:
try:
return int(value)
except (TypeError, ValueError):
return None
async def _sync_update_for_model(self, model_id: Optional[int]) -> None:
if self._update_service is None or model_id is None:
return
try:
versions = await self._scanner.get_model_versions_by_id(model_id)
except Exception as exc: # pragma: no cover - defensive log
logger.debug(
"Failed to collect local versions for model %s: %s", model_id, exc
)
versions = []
version_ids = set()
for version in versions or []:
candidate = (
version.get("versionId")
or version.get("id")
or version.get("version_id")
)
normalized = ModelLifecycleService._coerce_int(candidate)
if normalized is not None:
version_ids.add(normalized)
try:
await self._update_service.update_in_library_versions(
self._scanner.model_type,
model_id,
sorted(version_ids),
)
except Exception as exc: # pragma: no cover - defensive log
logger.debug(
"Failed to sync update record for model %s: %s", model_id, exc
)
async def exclude_model(self, file_path: str) -> Dict[str, object]:
"""Mark a model as excluded and prune cache references."""
@@ -146,16 +234,19 @@ class ModelLifecycleService:
raise ValueError("Invalid characters in file name")
target_dir = os.path.dirname(file_path)
base_name = os.path.basename(file_path)
old_file_name, old_extension = os.path.splitext(base_name)
if not old_extension:
old_extension = ".safetensors"
new_file_path = os.path.join(
target_dir, f"{new_file_name}{old_extension}"
).replace(os.sep, "/")
if os.path.exists(new_file_path):
raise ValueError("A file with this name already exists")
patterns = [
f"{old_file_name}{old_extension}",
f"{old_file_name}.metadata.json",
f"{old_file_name}.metadata.json.bak",
]
@@ -248,7 +339,7 @@ class ModelLifecycleService:
return suffix
basename = os.path.basename(filename)
dot_index = basename.rfind(".")
if dot_index != -1:
return basename[dot_index:]
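The `find` → `rfind` change matters for multi-dot filenames: `find` returns everything from the first dot, while `rfind` isolates the final extension, matching `os.path.splitext`. A small illustration:

```python
import os


def last_suffix(filename: str) -> str:
    """Return the final extension, agreeing with os.path.splitext for multi-dot names."""
    basename = os.path.basename(filename)
    dot_index = basename.rfind(".")  # rfind: last dot, not the first
    return basename[dot_index:] if dot_index != -1 else ""
```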


@@ -41,6 +41,55 @@ def _require_aiosqlite() -> Any:
logger = logging.getLogger(__name__)
class _RateLimitRetryHelper:
"""Coordinate exponential backoff retries after rate limiting."""
def __init__(
self,
*,
retry_limit: int = 3,
base_delay: float = 1.5,
max_delay: float = 30.0,
jitter_ratio: float = 0.2,
) -> None:
self._retry_limit = max(1, retry_limit)
self._base_delay = base_delay
self._max_delay = max_delay
self._jitter_ratio = max(0.0, jitter_ratio)
async def run(self, label: str, func, *args, **kwargs):
attempt = 0
while True:
try:
return await func(*args, **kwargs)
except RateLimitError as exc:
attempt += 1
if attempt >= self._retry_limit:
exc.provider = exc.provider or label
raise
delay = self._calculate_delay(exc.retry_after, attempt)
logger.warning(
"Provider %s rate limited request; retrying in %.2fs (attempt %s/%s)",
label,
delay,
attempt,
self._retry_limit,
)
await asyncio.sleep(delay)
def _calculate_delay(self, retry_after: Optional[float], attempt: int) -> float:
if retry_after is not None:
return min(self._max_delay, max(0.0, retry_after))
base_delay = self._base_delay * (2 ** max(0, attempt - 1))
jitter_span = base_delay * self._jitter_ratio
if jitter_span > 0:
base_delay += random.uniform(-jitter_span, jitter_span)
return min(self._max_delay, max(0.0, base_delay))
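`_calculate_delay` prefers the server's `Retry-After` hint when present; otherwise it grows the delay exponentially (`base * 2^(attempt-1)`), adds ±20% jitter to de-synchronise concurrent clients, and clamps to the cap. The same arithmetic as a standalone function (a zero-jitter RNG is injected below to make the values deterministic):

```python
import random


def backoff_delay(retry_after, attempt, base=1.5, cap=30.0, jitter_ratio=0.2, rng=random):
    """Compute the retry delay used by the rate-limit helper."""
    if retry_after is not None:
        return min(cap, max(0.0, retry_after))  # server hint wins, clamped to cap
    delay = base * (2 ** max(0, attempt - 1))   # 1.5, 3, 6, 12, ...
    span = delay * jitter_ratio
    if span > 0:
        delay += rng.uniform(-span, span)       # +/- jitter spreads out retries
    return min(cap, max(0.0, delay))
```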
class ModelMetadataProvider(ABC):
"""Base abstract class for all model metadata providers"""
@@ -390,6 +439,12 @@ class FallbackMetadataProvider(ModelMetadataProvider):
self._rate_limit_base_delay = rate_limit_base_delay
self._rate_limit_max_delay = rate_limit_max_delay
self._rate_limit_jitter_ratio = max(0.0, rate_limit_jitter_ratio)
self._rate_limit_helper = _RateLimitRetryHelper(
retry_limit=self._rate_limit_retry_limit,
base_delay=self._rate_limit_base_delay,
max_delay=self._rate_limit_max_delay,
jitter_ratio=self._rate_limit_jitter_ratio,
)
async def get_model_by_hash(self, model_hash: str) -> Tuple[Optional[Dict], Optional[str]]:
for provider, label in self._iter_providers():
@@ -485,44 +540,80 @@ class FallbackMetadataProvider(ModelMetadataProvider):
def _iter_providers(self):
return zip(self.providers, self._provider_labels)
async def _call_with_rate_limit(self, label: str, func, *args, **kwargs):
return await self._rate_limit_helper.run(label, func, *args, **kwargs)
class RateLimitRetryingProvider(ModelMetadataProvider):
"""Adapter that retries individual provider calls after rate limiting."""
def __init__(
self,
provider: ModelMetadataProvider,
label: Optional[str] = None,
*,
rate_limit_retry_limit: int = 3,
rate_limit_base_delay: float = 1.5,
rate_limit_max_delay: float = 30.0,
rate_limit_jitter_ratio: float = 0.2,
) -> None:
self._provider = provider
self._label = label or provider.__class__.__name__
self._rate_limit_helper = _RateLimitRetryHelper(
retry_limit=rate_limit_retry_limit,
base_delay=rate_limit_base_delay,
max_delay=rate_limit_max_delay,
jitter_ratio=rate_limit_jitter_ratio,
)
def __getattr__(self, item):
return getattr(self._provider, item)
async def get_model_by_hash(self, model_hash: str) -> Tuple[Optional[Dict], Optional[str]]:
return await self._rate_limit_helper.run(
self._label,
self._provider.get_model_by_hash,
model_hash,
)
async def get_model_versions(self, model_id: str) -> Optional[Dict]:
return await self._rate_limit_helper.run(
self._label,
self._provider.get_model_versions,
model_id,
)
async def get_model_versions_bulk(
self,
model_ids: Sequence[int],
) -> Optional[Dict[int, Dict]]:
return await self._rate_limit_helper.run(
self._label,
self._provider.get_model_versions_bulk,
model_ids,
)
async def get_model_version(self, model_id: Optional[int] = None, version_id: Optional[int] = None) -> Optional[Dict]:
return await self._rate_limit_helper.run(
self._label,
self._provider.get_model_version,
model_id,
version_id,
)
async def get_model_version_info(self, version_id: str) -> Tuple[Optional[Dict], Optional[str]]:
return await self._rate_limit_helper.run(
self._label,
self._provider.get_model_version_info,
version_id,
)
async def get_user_models(self, username: str) -> Optional[List[Dict]]:
return await self._rate_limit_helper.run(
self._label,
self._provider.get_user_models,
username,
)
class ModelMetadataProviderManager:
"""Manager for selecting and using model metadata providers"""


@@ -1,12 +1,49 @@
from __future__ import annotations
from dataclasses import dataclass
from typing import Any, Dict, Iterable, List, Mapping, Optional, Sequence, Tuple, Protocol, Callable
from ..utils.constants import NSFW_LEVELS
from ..utils.utils import fuzzy_match as default_fuzzy_match
DEFAULT_CIVITAI_MODEL_TYPE = "LORA"
def _coerce_to_str(value: Any) -> Optional[str]:
if value is None:
return None
candidate = str(value).strip()
return candidate if candidate else None
def normalize_civitai_model_type(value: Any) -> Optional[str]:
"""Return a lowercase string suitable for comparisons."""
candidate = _coerce_to_str(value)
return candidate.lower() if candidate else None
def resolve_civitai_model_type(entry: Mapping[str, Any]) -> str:
"""Extract the model type from CivitAI metadata, defaulting to LORA."""
if not isinstance(entry, Mapping):
return DEFAULT_CIVITAI_MODEL_TYPE
civitai = entry.get("civitai")
if isinstance(civitai, Mapping):
civitai_model = civitai.get("model")
if isinstance(civitai_model, Mapping):
model_type = _coerce_to_str(civitai_model.get("type"))
if model_type:
return model_type
model_type = _coerce_to_str(entry.get("model_type"))
if model_type:
return model_type
return DEFAULT_CIVITAI_MODEL_TYPE
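`resolve_civitai_model_type` walks `civitai.model.type` first, falls back to the entry's own `model_type`, and defaults to `LORA`. A condensed sketch of that precedence:

```python
DEFAULT_TYPE = "LORA"


def resolve_model_type(entry: dict) -> str:
    """Prefer civitai.model.type, then entry['model_type'], else the LORA default."""
    civitai = entry.get("civitai") or {}
    model = civitai.get("model") if isinstance(civitai, dict) else None
    if isinstance(model, dict):
        value = str(model.get("type") or "").strip()
        if value:
            return value
    value = str(entry.get("model_type") or "").strip()
    return value or DEFAULT_TYPE
```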
class SettingsProvider(Protocol):
"""Protocol describing the SettingsManager contract used by query helpers."""
@@ -28,9 +65,10 @@ class FilterCriteria:
folder: Optional[str] = None
base_models: Optional[Sequence[str]] = None
tags: Optional[Dict[str, str]] = None
favorites_only: bool = False
search_options: Optional[Dict[str, Any]] = None
model_types: Optional[Sequence[str]] = None
class ModelCacheRepository:
@@ -108,12 +146,43 @@ class ModelFilterSet:
base_model_set = set(base_models)
items = [item for item in items if item.get("base_model") in base_model_set]
tag_filters = criteria.tags or {}
include_tags = set()
exclude_tags = set()
if isinstance(tag_filters, dict):
for tag, state in tag_filters.items():
if not tag:
continue
if state == "exclude":
exclude_tags.add(tag)
else:
include_tags.add(tag)
else:
include_tags = {tag for tag in tag_filters if tag}
if include_tags:
items = [
item for item in items
if any(tag in include_tags for tag in (item.get("tags", []) or []))
]
if exclude_tags:
items = [
item for item in items
if not any(tag in exclude_tags for tag in (item.get("tags", []) or []))
]
model_types = criteria.model_types or []
normalized_model_types = {
model_type for model_type in (
normalize_civitai_model_type(value) for value in model_types
)
if model_type
}
if normalized_model_types:
items = [
item for item in items
if normalize_civitai_model_type(resolve_civitai_model_type(item)) in normalized_model_types
]
return items
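The tag filter now takes a mapping of tag → state instead of a flat sequence, splitting it into include and exclude sets before two filtering passes. A compact sketch of the same logic over plain dicts:

```python
def filter_by_tags(items, tag_filters):
    """Apply include/exclude tag states, mirroring the filter-set logic above."""
    include = {t for t, state in tag_filters.items() if t and state != "exclude"}
    exclude = {t for t, state in tag_filters.items() if t and state == "exclude"}
    if include:
        items = [i for i in items if any(t in include for t in (i.get("tags") or []))]
    if exclude:
        items = [i for i in items if not any(t in exclude for t in (i.get("tags") or []))]
    return items
```

The `or []` guard matters: cached entries may carry `tags: None`, which would otherwise break iteration.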


@@ -11,6 +11,7 @@ from ..utils.models import BaseModelMetadata
from ..config import config
from ..utils.file_utils import find_preview_file, get_preview_extension
from ..utils.metadata_manager import MetadataManager
from ..utils.civitai_utils import resolve_license_info
from .model_cache import ModelCache
from .model_hash_index import ModelHashIndex
from ..utils.constants import PREVIEW_EXTENSIONS
@@ -160,6 +161,12 @@ class ModelScanner:
if trained_words:
slim['trainedWords'] = list(trained_words) if isinstance(trained_words, list) else trained_words
civitai_model = civitai.get('model')
if isinstance(civitai_model, Mapping):
model_type_value = civitai_model.get('type')
if model_type_value not in (None, '', []):
slim['model'] = {'type': model_type_value}
return slim or None
def _build_cache_entry(
@@ -175,7 +182,17 @@ class ModelScanner:
def get_value(key: str, default: Any = None) -> Any:
if is_mapping:
return source.get(key, default)
sentinel = object()
value = getattr(source, key, sentinel)
if value is not sentinel:
return value
unknown = getattr(source, "_unknown_fields", None)
if isinstance(unknown, dict) and key in unknown:
return unknown[key]
return default
file_path = file_path_override or get_value('file_path', '') or ''
normalized_path = file_path.replace('\\', '/')
@@ -197,7 +214,8 @@ class ModelScanner:
else:
preview_url = ''
civitai_full = get_value('civitai')
civitai_slim = self._slim_civitai_payload(civitai_full)
usage_tips = get_value('usage_tips', '') or ''
if not isinstance(usage_tips, str):
usage_tips = str(usage_tips)
@@ -229,12 +247,76 @@ class ModelScanner:
'civitai_deleted': bool(get_value('civitai_deleted', False)),
}
license_source: Dict[str, Any] = {}
if isinstance(civitai_full, Mapping):
civitai_model = civitai_full.get('model')
if isinstance(civitai_model, Mapping):
for key in (
'allowNoCredit',
'allowCommercialUse',
'allowDerivatives',
'allowDifferentLicense',
):
if key in civitai_model:
license_source[key] = civitai_model.get(key)
for key in (
'allowNoCredit',
'allowCommercialUse',
'allowDerivatives',
'allowDifferentLicense',
):
if key not in license_source:
value = get_value(key)
if value is not None:
license_source[key] = value
_, license_flags = resolve_license_info(license_source or {})
entry['license_flags'] = license_flags
model_type = get_value('model_type', None)
if model_type:
entry['model_type'] = model_type
return entry
def _ensure_license_flags(self, entry: Dict[str, Any]) -> None:
"""Ensure cached entries include an integer license flag bitset."""
if not isinstance(entry, dict):
return
license_value = entry.get('license_flags')
if license_value is not None:
try:
entry['license_flags'] = int(license_value)
except (TypeError, ValueError):
_, fallback_flags = resolve_license_info({})
entry['license_flags'] = fallback_flags
return
license_source = {
'allowNoCredit': entry.get('allowNoCredit'),
'allowCommercialUse': entry.get('allowCommercialUse'),
'allowDerivatives': entry.get('allowDerivatives'),
'allowDifferentLicense': entry.get('allowDifferentLicense'),
}
civitai_full = entry.get('civitai')
if isinstance(civitai_full, Mapping):
civitai_model = civitai_full.get('model')
if isinstance(civitai_model, Mapping):
for key in (
'allowNoCredit',
'allowCommercialUse',
'allowDerivatives',
'allowDifferentLicense',
):
if key in civitai_model:
license_source[key] = civitai_model.get(key)
_, license_flags = resolve_license_info(license_source)
entry['license_flags'] = license_flags
async def initialize_in_background(self) -> None:
"""Initialize cache in background using thread pool"""
try:
@@ -567,6 +649,7 @@ class ModelScanner:
async def _initialize_cache(self) -> None:
"""Initialize or refresh the cache"""
logger.debug("Cache initialization starting")
self._is_initializing = True # Set flag
try:
start_time = time.time()
@@ -575,6 +658,7 @@ class ModelScanner:
scan_result = await self._gather_model_data()
await self._apply_scan_result(scan_result)
await self._save_persistent_cache(scan_result)
logger.debug("Cache initialization scan complete")
logger.info(
f"{self.model_type.capitalize()} Scanner: Cache initialization completed in {time.time() - start_time:.2f} seconds, "
@@ -681,6 +765,7 @@ class ModelScanner:
model_data = self.adjust_cached_entry(dict(model_data))
if not model_data:
continue
self._ensure_license_flags(model_data)
# Add to cache
self._cache.raw_data.append(model_data)
self._cache.add_to_version_index(model_data)
@@ -975,6 +1060,7 @@ class ModelScanner:
processed_files += 1
if result:
self._ensure_license_flags(result)
raw_data.append(result)
sha_value = result.get('sha256')
@@ -1358,11 +1444,13 @@ class ModelScanner:
for file_path in file_paths:
try:
target_dir = os.path.dirname(file_path)
base_name = os.path.basename(file_path)
file_name, main_extension = os.path.splitext(base_name)
deleted_files = await delete_model_artifacts(
target_dir,
file_name,
main_extension=main_extension,
)
if deleted_files:


@@ -17,6 +17,41 @@ from ..utils.preview_selection import select_preview_media
logger = logging.getLogger(__name__)
def _normalize_int(value) -> Optional[int]:
"""Safely convert a value to an integer."""
try:
if value is None:
return None
return int(value)
except (TypeError, ValueError):
return None
def _normalize_string(value) -> Optional[str]:
"""Return a stripped string or None if the value is empty."""
if value is None:
return None
if isinstance(value, str):
stripped = value.strip()
return stripped or None
try:
normalized = str(value).strip()
return normalized or None
except Exception:
return None
def _normalize_base_model(value) -> Optional[str]:
"""Normalize base-model names for case-insensitive comparison."""
normalized = _normalize_string(value)
if normalized is None:
return None
return normalized.lower()
@dataclass
class ModelVersionRecord:
"""Persisted metadata for a single model version."""
@@ -85,6 +120,47 @@ class ModelUpdateRecord:
return True
return False
def has_update_for_base(
self,
local_version_id: Optional[int],
local_base_model: Optional[str],
) -> bool:
"""Return True when a newer remote version with the same base model exists."""
if self.should_ignore_model:
return False
normalized_base = _normalize_base_model(local_base_model)
if normalized_base is None:
return False
threshold = _normalize_int(local_version_id)
if threshold is None:
highest_local = None
for version in self.versions:
if not version.is_in_library:
continue
version_base = _normalize_base_model(version.base_model)
if version_base != normalized_base:
continue
if highest_local is None or version.version_id > highest_local:
highest_local = version.version_id
threshold = highest_local
if threshold is None:
return False
for version in self.versions:
if version.is_in_library or version.should_ignore:
continue
version_base = _normalize_base_model(version.base_model)
if version_base != normalized_base:
continue
if version.version_id > threshold:
return True
return False
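The same-base update rule above can be sketched standalone. `Version` below is a simplified stand-in for `ModelVersionRecord` (field names match the diff; the `should_ignore_model` short-circuit is omitted, and everything else is illustrative):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Version:
    """Simplified stand-in for ModelVersionRecord."""
    version_id: int
    base_model: Optional[str]
    is_in_library: bool = False
    should_ignore: bool = False

def has_update_for_base(
    versions: List[Version],
    local_version_id: Optional[int],
    local_base_model: Optional[str],
) -> bool:
    """Return True when a newer remote version with the same base model exists."""
    base = (local_base_model or "").strip().lower() or None
    if base is None:
        return False
    threshold = local_version_id
    if threshold is None:
        # Fall back to the highest in-library version that shares the base model.
        local_ids = [
            v.version_id for v in versions
            if v.is_in_library and (v.base_model or "").strip().lower() == base
        ]
        threshold = max(local_ids, default=None)
    if threshold is None:
        return False
    return any(
        v.version_id > threshold
        for v in versions
        if not v.is_in_library and not v.should_ignore
        and (v.base_model or "").strip().lower() == base
    )

versions = [
    Version(100, "SDXL", is_in_library=True),
    Version(120, "sdxl"),       # newer remote release of the same base model
    Version(130, "Flux.1 D"),   # newer remote release of a different base model
]
print(has_update_for_base(versions, None, "SDXL"))      # True
print(has_update_for_base(versions, None, "Flux.1 D"))  # False: no in-library Flux version
```

The base-model comparison is case-insensitive, so a remote "sdxl" release still flags an update for a local "SDXL" model.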
class ModelUpdateService:
"""Persist and query remote model version metadata."""
@@ -628,6 +704,20 @@ class ModelUpdateService:
for model_id in normalized_ids
}
async def get_records_bulk(
self,
model_type: str,
model_ids: Sequence[int],
) -> Dict[int, ModelUpdateRecord]:
"""Return cached update records for the requested models."""
normalized_ids = self._normalize_sequence(model_ids)
if not normalized_ids:
return {}
async with self._lock:
return self._get_records_bulk(model_type, normalized_ids)
async def _refresh_single_model(
self,
model_type: str,
@@ -799,7 +889,7 @@ class ModelUpdateService:
)
continue
for key, value in response.items():
-normalized_key = self._normalize_int(key)
+normalized_key = _normalize_int(key)
if normalized_key is None:
continue
if isinstance(value, Mapping):
@@ -832,8 +922,8 @@ class ModelUpdateService:
civitai = item.get("civitai") if isinstance(item, dict) else None
if not isinstance(civitai, dict):
continue
-model_id = self._normalize_int(civitai.get("modelId"))
-version_id = self._normalize_int(civitai.get("id"))
+model_id = _normalize_int(civitai.get("modelId"))
+version_id = _normalize_int(civitai.get("id"))
if model_id is None or version_id is None:
continue
if target_set is not None and model_id not in target_set:
@@ -973,35 +1063,14 @@ class ModelUpdateService:
return True
return (now - record.last_checked_at) >= self._ttl_seconds
-@staticmethod
-def _normalize_int(value) -> Optional[int]:
-try:
-if value is None:
-return None
-return int(value)
-except (TypeError, ValueError):
-return None
def _normalize_sequence(self, values: Sequence[int]) -> List[int]:
normalized = [
item
-for item in (self._normalize_int(value) for value in values)
+for item in (_normalize_int(value) for value in values)
if item is not None
]
return sorted(dict.fromkeys(normalized))
-@staticmethod
-def _normalize_string(value) -> Optional[str]:
-if value is None:
-return None
-if isinstance(value, str):
-stripped = value.strip()
-return stripped or None
-try:
-return str(value)
-except Exception: # pragma: no cover - defensive conversion
-return None
def _extract_versions(self, response) -> Optional[List[ModelVersionRecord]]:
if not isinstance(response, Mapping):
return None
@@ -1014,12 +1083,12 @@ class ModelUpdateService:
for index, entry in enumerate(versions):
if not isinstance(entry, Mapping):
continue
-version_id = self._normalize_int(entry.get("id"))
+version_id = _normalize_int(entry.get("id"))
if version_id is None:
continue
-name = self._normalize_string(entry.get("name"))
-base_model = self._normalize_string(entry.get("baseModel"))
-released_at = self._normalize_string(entry.get("publishedAt") or entry.get("createdAt"))
+name = _normalize_string(entry.get("name"))
+base_model = _normalize_string(entry.get("baseModel"))
+released_at = _normalize_string(entry.get("publishedAt") or entry.get("createdAt"))
size_bytes = self._extract_size_bytes(entry.get("files"))
preview_url = self._extract_preview_url(entry.get("images"))
extracted.append(
@@ -1152,11 +1221,11 @@ class ModelUpdateService:
name=row["name"],
base_model=row["base_model"],
released_at=row["released_at"],
-size_bytes=self._normalize_int(row["size_bytes"]),
+size_bytes=_normalize_int(row["size_bytes"]),
preview_url=row["preview_url"],
is_in_library=bool(row["is_in_library"]),
should_ignore=bool(row["should_ignore"]),
-sort_index=self._normalize_int(row["sort_index"]) or 0,
+sort_index=_normalize_int(row["sort_index"]) or 0,
)
)

View File

@@ -5,7 +5,7 @@ import re
import sqlite3
import threading
from dataclasses import dataclass
from typing import Dict, List, Optional, Sequence, Tuple
from typing import Dict, List, Mapping, Optional, Sequence, Tuple
from ..utils.settings_paths import get_project_root, get_settings_dir
@@ -21,6 +21,9 @@ class PersistedCacheData:
excluded_models: List[str]
DEFAULT_LICENSE_FLAGS = 127 # 127 (0b1111111) encodes default CivitAI permissions with all commercial modes enabled.
class PersistentModelCache:
"""Persist core model metadata and hash index data in SQLite."""
@@ -44,9 +47,11 @@ class PersistentModelCache:
"metadata_source",
"civitai_id",
"civitai_model_id",
"civitai_model_type",
"civitai_name",
"civitai_creator_username",
"trained_words",
"license_flags",
"civitai_deleted",
"exclude",
"db_checked",
@@ -134,7 +139,8 @@ class PersistentModelCache:
creator_username = row["civitai_creator_username"]
civitai: Optional[Dict] = None
civitai_has_data = any(
-row[col] is not None for col in ("civitai_id", "civitai_model_id", "civitai_name")
+row[col] is not None
+for col in ("civitai_id", "civitai_model_id", "civitai_model_type", "civitai_name")
) or trained_words or creator_username
if civitai_has_data:
civitai = {}
@@ -148,6 +154,13 @@ class PersistentModelCache:
civitai["trainedWords"] = trained_words
if creator_username:
civitai.setdefault("creator", {})["username"] = creator_username
model_type_value = row["civitai_model_type"]
if model_type_value:
civitai.setdefault("model", {})["type"] = model_type_value
license_value = row["license_flags"]
if license_value is None:
license_value = DEFAULT_LICENSE_FLAGS
item = {
"file_path": file_path,
@@ -171,6 +184,7 @@ class PersistentModelCache:
"tags": tags.get(file_path, []),
"civitai": civitai,
"civitai_deleted": bool(row["civitai_deleted"]),
"license_flags": int(license_value),
}
raw_data.append(item)
@@ -434,6 +448,7 @@ class PersistentModelCache:
metadata_source TEXT,
civitai_id INTEGER,
civitai_model_id INTEGER,
civitai_model_type TEXT,
civitai_name TEXT,
civitai_creator_username TEXT,
trained_words TEXT,
@@ -483,7 +498,10 @@ class PersistentModelCache:
required_columns = {
"metadata_source": "TEXT",
"civitai_creator_username": "TEXT",
"civitai_model_type": "TEXT",
"civitai_deleted": "INTEGER DEFAULT 0",
# Persisting without explicit flags should assume CivitAI's documented defaults (0b1111111 == 127).
"license_flags": f"INTEGER DEFAULT {DEFAULT_LICENSE_FLAGS}",
}
for column, definition in required_columns.items():
@@ -517,6 +535,17 @@ class PersistentModelCache:
creator_data = civitai.get("creator") if isinstance(civitai, dict) else None
if isinstance(creator_data, dict):
creator_username = creator_data.get("username") or None
model_type_value = None
if isinstance(civitai, Mapping):
civitai_model_info = civitai.get("model")
if isinstance(civitai_model_info, Mapping):
candidate_type = civitai_model_info.get("type")
if candidate_type not in (None, "", []):
model_type_value = candidate_type
license_flags = item.get("license_flags")
if license_flags is None:
license_flags = DEFAULT_LICENSE_FLAGS
return (
model_type,
@@ -537,9 +566,11 @@ class PersistentModelCache:
metadata_source,
civitai.get("id"),
civitai.get("modelId"),
model_type_value,
civitai.get("name"),
creator_username,
trained_words_json,
int(license_flags),
1 if item.get("civitai_deleted") else 0,
1 if item.get("exclude") else 0,
1 if item.get("db_checked") else 0,

View File

@@ -9,6 +9,7 @@ from .recipe_cache import RecipeCache
from .service_registry import ServiceRegistry
from .lora_scanner import LoraScanner
from .metadata_service import get_default_metadata_provider
from .checkpoint_scanner import CheckpointScanner
from .recipes.errors import RecipeNotFoundError
from ..utils.utils import calculate_recipe_fingerprint, fuzzy_match
from natsort import natsorted
@@ -23,24 +24,39 @@ class RecipeScanner:
_lock = asyncio.Lock()
@classmethod
-async def get_instance(cls, lora_scanner: Optional[LoraScanner] = None):
+async def get_instance(
+cls,
+lora_scanner: Optional[LoraScanner] = None,
+checkpoint_scanner: Optional[CheckpointScanner] = None,
+):
"""Get singleton instance of RecipeScanner"""
async with cls._lock:
if cls._instance is None:
if not lora_scanner:
# Get lora scanner from service registry if not provided
lora_scanner = await ServiceRegistry.get_lora_scanner()
cls._instance = cls(lora_scanner)
if not checkpoint_scanner:
checkpoint_scanner = await ServiceRegistry.get_checkpoint_scanner()
cls._instance = cls(lora_scanner, checkpoint_scanner)
return cls._instance
-def __new__(cls, lora_scanner: Optional[LoraScanner] = None):
+def __new__(
+cls,
+lora_scanner: Optional[LoraScanner] = None,
+checkpoint_scanner: Optional[CheckpointScanner] = None,
+):
if cls._instance is None:
cls._instance = super().__new__(cls)
cls._instance._lora_scanner = lora_scanner
cls._instance._checkpoint_scanner = checkpoint_scanner
cls._instance._civitai_client = None # Will be lazily initialized
return cls._instance
-def __init__(self, lora_scanner: Optional[LoraScanner] = None):
+def __init__(
+self,
+lora_scanner: Optional[LoraScanner] = None,
+checkpoint_scanner: Optional[CheckpointScanner] = None,
+):
# Ensure initialization only happens once
if not hasattr(self, '_initialized'):
self._cache: Optional[RecipeCache] = None
@@ -51,6 +67,8 @@ class RecipeScanner:
self._resort_tasks: Set[asyncio.Task] = set()
if lora_scanner:
self._lora_scanner = lora_scanner
if checkpoint_scanner:
self._checkpoint_scanner = checkpoint_scanner
self._initialized = True
def on_library_changed(self) -> None:
@@ -422,6 +440,14 @@ class RecipeScanner:
# Update lora information with local paths and availability
await self._update_lora_information(recipe_data)
if recipe_data.get('checkpoint'):
checkpoint_entry = self._normalize_checkpoint_entry(recipe_data['checkpoint'])
if checkpoint_entry:
recipe_data['checkpoint'] = self._enrich_checkpoint_entry(checkpoint_entry)
else:
logger.warning("Dropping invalid checkpoint entry in %s", recipe_path)
recipe_data.pop('checkpoint', None)
# Calculate and update fingerprint if missing
if 'loras' in recipe_data and 'fingerprint' not in recipe_data:
fingerprint = calculate_recipe_fingerprint(recipe_data['loras'])
@@ -564,6 +590,48 @@ class RecipeScanner:
logger.error(f"Error getting hash from Civitai: {e}")
return None, False
def _get_lora_from_version_index(self, model_version_id: Any) -> Optional[Dict[str, Any]]:
"""Quickly fetch a cached LoRA entry by modelVersionId using the version index."""
if not self._lora_scanner:
return None
cache = getattr(self._lora_scanner, "_cache", None)
if cache is None:
return None
version_index = getattr(cache, "version_index", None)
if not version_index:
return None
try:
normalized_id = int(model_version_id)
except (TypeError, ValueError):
return None
return version_index.get(normalized_id)
def _get_checkpoint_from_version_index(self, model_version_id: Any) -> Optional[Dict[str, Any]]:
"""Fetch a cached checkpoint entry by version id."""
if not self._checkpoint_scanner:
return None
cache = getattr(self._checkpoint_scanner, "_cache", None)
if cache is None:
return None
version_index = getattr(cache, "version_index", None)
if not version_index:
return None
try:
normalized_id = int(model_version_id)
except (TypeError, ValueError):
return None
return version_index.get(normalized_id)
async def _determine_base_model(self, loras: List[Dict]) -> Optional[str]:
"""Determine the most common base model among LoRAs"""
base_models = {}
@@ -602,6 +670,80 @@ class RecipeScanner:
logger.error(f"Error getting base model for lora: {e}")
return None
def _normalize_checkpoint_entry(self, checkpoint_raw: Any) -> Optional[Dict[str, Any]]:
"""Coerce legacy or malformed checkpoint entries into a dict."""
if isinstance(checkpoint_raw, dict):
return dict(checkpoint_raw)
if isinstance(checkpoint_raw, (list, tuple)) and len(checkpoint_raw) == 1:
return self._normalize_checkpoint_entry(checkpoint_raw[0])
if isinstance(checkpoint_raw, str):
name = checkpoint_raw.strip()
if not name:
return None
file_name = os.path.splitext(os.path.basename(name))[0]
return {
"name": name,
"file_name": file_name,
}
logger.warning("Unexpected checkpoint payload type %s", type(checkpoint_raw).__name__)
return None
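The coercion rules above are easy to verify standalone. A sketch mirroring `_normalize_checkpoint_entry` as a plain function (same rules as the method, minus the logger):

```python
import os
from typing import Any, Dict, Optional

def normalize_checkpoint_entry(raw: Any) -> Optional[Dict[str, Any]]:
    """Coerce legacy or malformed checkpoint entries into a dict."""
    if isinstance(raw, dict):
        return dict(raw)
    if isinstance(raw, (list, tuple)) and len(raw) == 1:
        # Some legacy recipes wrap the checkpoint in a one-element list.
        return normalize_checkpoint_entry(raw[0])
    if isinstance(raw, str):
        name = raw.strip()
        if not name:
            return None
        return {
            "name": name,
            "file_name": os.path.splitext(os.path.basename(name))[0],
        }
    return None

print(normalize_checkpoint_entry("models/sdxl_base.safetensors"))
# {'name': 'models/sdxl_base.safetensors', 'file_name': 'sdxl_base'}
print(normalize_checkpoint_entry(42))  # None
```

Anything that cannot be coerced yields `None`, which the scanner uses as the signal to drop the invalid checkpoint entry from the recipe.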
def _enrich_checkpoint_entry(self, checkpoint: Dict[str, Any]) -> Dict[str, Any]:
"""Populate convenience fields for a checkpoint entry."""
if not checkpoint or not isinstance(checkpoint, dict) or not self._checkpoint_scanner:
return checkpoint
hash_value = (checkpoint.get('hash') or '').lower()
version_entry = None
model_version_id = checkpoint.get('id') or checkpoint.get('modelVersionId')
if not hash_value and model_version_id is not None:
version_entry = self._get_checkpoint_from_version_index(model_version_id)
try:
preview_url = checkpoint.get('preview_url') or checkpoint.get('thumbnailUrl')
if preview_url:
checkpoint['preview_url'] = self._normalize_preview_url(preview_url)
if hash_value:
checkpoint['inLibrary'] = self._checkpoint_scanner.has_hash(hash_value)
checkpoint['preview_url'] = self._normalize_preview_url(
checkpoint.get('preview_url')
or self._checkpoint_scanner.get_preview_url_by_hash(hash_value)
)
checkpoint['localPath'] = self._checkpoint_scanner.get_path_by_hash(hash_value)
elif version_entry:
checkpoint['inLibrary'] = True
cached_path = version_entry.get('file_path') or version_entry.get('path')
if cached_path:
checkpoint.setdefault('localPath', cached_path)
if not checkpoint.get('file_name'):
checkpoint['file_name'] = os.path.splitext(os.path.basename(cached_path))[0]
if version_entry.get('sha256') and not checkpoint.get('hash'):
checkpoint['hash'] = version_entry.get('sha256')
preview_url = self._normalize_preview_url(version_entry.get('preview_url'))
if preview_url:
checkpoint.setdefault('preview_url', preview_url)
if version_entry.get('model_type'):
checkpoint.setdefault('model_type', version_entry.get('model_type'))
else:
checkpoint.setdefault('inLibrary', False)
if checkpoint.get('preview_url'):
checkpoint['preview_url'] = self._normalize_preview_url(checkpoint['preview_url'])
except Exception as exc: # pragma: no cover - defensive logging
logger.debug("Error enriching checkpoint entry %s: %s", hash_value or model_version_id, exc)
return checkpoint
def _enrich_lora_entry(self, lora: Dict[str, Any]) -> Dict[str, Any]:
"""Populate convenience fields for a LoRA entry."""
@@ -609,18 +751,56 @@ class RecipeScanner:
return lora
hash_value = (lora.get('hash') or '').lower()
-if not hash_value:
-return lora
+version_entry = None
+if not hash_value and lora.get('modelVersionId') is not None:
+version_entry = self._get_lora_from_version_index(lora.get('modelVersionId'))
try:
-lora['inLibrary'] = self._lora_scanner.has_hash(hash_value)
-lora['preview_url'] = self._lora_scanner.get_preview_url_by_hash(hash_value)
-lora['localPath'] = self._lora_scanner.get_path_by_hash(hash_value)
+if hash_value:
+lora['inLibrary'] = self._lora_scanner.has_hash(hash_value)
+lora['preview_url'] = self._normalize_preview_url(
+self._lora_scanner.get_preview_url_by_hash(hash_value)
+)
+lora['localPath'] = self._lora_scanner.get_path_by_hash(hash_value)
+elif version_entry:
+lora['inLibrary'] = True
+cached_path = version_entry.get('file_path') or version_entry.get('path')
+if cached_path:
+lora.setdefault('localPath', cached_path)
+if not lora.get('file_name'):
+lora['file_name'] = os.path.splitext(os.path.basename(cached_path))[0]
+if version_entry.get('sha256') and not lora.get('hash'):
+lora['hash'] = version_entry.get('sha256')
+preview_url = self._normalize_preview_url(version_entry.get('preview_url'))
+if preview_url:
+lora.setdefault('preview_url', preview_url)
+else:
+lora.setdefault('inLibrary', False)
+if lora.get('preview_url'):
+lora['preview_url'] = self._normalize_preview_url(lora['preview_url'])
except Exception as exc: # pragma: no cover - defensive logging
logger.debug("Error enriching lora entry %s: %s", hash_value, exc)
return lora
def _normalize_preview_url(self, preview_url: Optional[str]) -> Optional[str]:
"""Return a preview URL that is reachable from the browser."""
if not preview_url or not isinstance(preview_url, str):
return preview_url
normalized = preview_url.strip()
if normalized.startswith("/api/lm/previews?path="):
return normalized
if os.path.isabs(normalized):
return config.get_preview_static_url(normalized)
return normalized
async def get_local_lora(self, name: str) -> Optional[Dict[str, Any]]:
"""Lookup a local LoRA model by name."""
@@ -729,10 +909,32 @@ class RecipeScanner:
# Filter by tags
if 'tags' in filters and filters['tags']:
-filtered_data = [
-item for item in filtered_data
-if any(tag in item.get('tags', []) for tag in filters['tags'])
-]
+tag_spec = filters['tags']
+include_tags = set()
+exclude_tags = set()
+if isinstance(tag_spec, dict):
+for tag, state in tag_spec.items():
+if not tag:
+continue
+if state == 'exclude':
+exclude_tags.add(tag)
+else:
+include_tags.add(tag)
+else:
+include_tags = {tag for tag in tag_spec if tag}
+if include_tags:
+filtered_data = [
+item for item in filtered_data
+if any(tag in include_tags for tag in (item.get('tags', []) or []))
+]
+if exclude_tags:
+filtered_data = [
+item for item in filtered_data
+if not any(tag in exclude_tags for tag in (item.get('tags', []) or []))
+]
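The filter now accepts either a plain list of tags (include-only, the old behavior) or a dict mapping each tag to a state, where `'exclude'` entries remove matching items. A condensed sketch of that dual-mode logic:

```python
from typing import Any, Dict, List

def filter_by_tags(items: List[Dict[str, Any]], tag_spec: Any) -> List[Dict[str, Any]]:
    """Apply include/exclude tag filtering; tag_spec is a list or a {tag: state} dict."""
    include, exclude = set(), set()
    if isinstance(tag_spec, dict):
        for tag, state in tag_spec.items():
            if not tag:
                continue
            (exclude if state == "exclude" else include).add(tag)
    else:
        include = {tag for tag in tag_spec if tag}
    if include:
        items = [i for i in items if any(t in include for t in (i.get("tags") or []))]
    if exclude:
        items = [i for i in items if not any(t in exclude for t in (i.get("tags") or []))]
    return items

items = [
    {"name": "a", "tags": ["portrait", "anime"]},
    {"name": "b", "tags": ["portrait"]},
    {"name": "c", "tags": []},
]
print([i["name"] for i in filter_by_tags(items, {"portrait": "include", "anime": "exclude"})])
# ['b']
```

Exclusions are applied after inclusions, so an item matching both an include tag and an exclude tag is dropped.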
# Calculate pagination
total_items = len(filtered_data)
@@ -746,6 +948,12 @@ class RecipeScanner:
for item in paginated_items:
if 'loras' in item:
item['loras'] = [self._enrich_lora_entry(dict(lora)) for lora in item['loras']]
if item.get('checkpoint'):
checkpoint_entry = self._normalize_checkpoint_entry(item['checkpoint'])
if checkpoint_entry:
item['checkpoint'] = self._enrich_checkpoint_entry(checkpoint_entry)
else:
item.pop('checkpoint', None)
result = {
'items': paginated_items,
@@ -793,6 +1001,12 @@ class RecipeScanner:
# Add lora metadata
if 'loras' in formatted_recipe:
formatted_recipe['loras'] = [self._enrich_lora_entry(dict(lora)) for lora in formatted_recipe['loras']]
if formatted_recipe.get('checkpoint'):
checkpoint_entry = self._normalize_checkpoint_entry(formatted_recipe['checkpoint'])
if checkpoint_entry:
formatted_recipe['checkpoint'] = self._enrich_checkpoint_entry(checkpoint_entry)
else:
formatted_recipe.pop('checkpoint', None)
return formatted_recipe

View File

@@ -107,6 +107,12 @@ class RecipeAnalysisService:
raise RecipeDownloadError("No image URL found in Civitai response")
await self._download_image(image_url, temp_path)
metadata = image_info.get("meta") if "meta" in image_info else None
if (
isinstance(metadata, dict)
and "meta" in metadata
and isinstance(metadata["meta"], dict)
):
metadata = metadata["meta"]
else:
await self._download_image(url, temp_path)

View File

@@ -78,15 +78,15 @@ class RecipePersistenceService:
file_obj.write(optimized_image)
current_time = time.time()
-loras_data = [self._normalise_lora_entry(lora) for lora in metadata.get("loras", [])]
+loras_data = [self._normalise_lora_entry(lora) for lora in (metadata.get("loras") or [])]
+checkpoint_entry = self._sanitize_checkpoint_entry(self._extract_checkpoint_entry(metadata))
-gen_params = metadata.get("gen_params", {})
+gen_params = metadata.get("gen_params") or {}
if not gen_params and "raw_metadata" in metadata:
raw_metadata = metadata.get("raw_metadata", {})
gen_params = {
"prompt": raw_metadata.get("prompt", ""),
"negative_prompt": raw_metadata.get("negative_prompt", ""),
-"checkpoint": raw_metadata.get("checkpoint", {}),
"steps": raw_metadata.get("steps", ""),
"sampler": raw_metadata.get("sampler", ""),
"cfg_scale": raw_metadata.get("cfg_scale", ""),
@@ -95,6 +95,9 @@ class RecipePersistenceService:
"clip_skip": raw_metadata.get("clip_skip", ""),
}
# Drop checkpoint duplication from generation parameters to store it only at top level
gen_params.pop("checkpoint", None)
fingerprint = calculate_recipe_fingerprint(loras_data)
recipe_data: Dict[str, Any] = {
"id": recipe_id,
@@ -107,6 +110,8 @@ class RecipePersistenceService:
"gen_params": gen_params,
"fingerprint": fingerprint,
}
if checkpoint_entry:
recipe_data["checkpoint"] = checkpoint_entry
tags_list = list(tags)
if tags_list:
@@ -295,8 +300,6 @@ class RecipePersistenceService:
lora_stack = metadata.get("loras", "")
lora_matches = re.findall(r"<lora:([^:]+):([^>]+)>", lora_stack)
-if not lora_matches:
-raise RecipeValidationError("No LoRAs found in the generation metadata")
loras_data = []
base_model_counts: Dict[str, int] = {}
@@ -332,7 +335,7 @@ class RecipePersistenceService:
"created_date": time.time(),
"base_model": most_common_base_model,
"loras": loras_data,
-"checkpoint": metadata.get("checkpoint", ""),
+"checkpoint": self._sanitize_checkpoint_entry(metadata.get("checkpoint", "")),
"gen_params": {
key: value
for key, value in metadata.items()
@@ -361,6 +364,30 @@ class RecipePersistenceService:
# Helper methods ---------------------------------------------------
def _extract_checkpoint_entry(self, metadata: dict[str, Any]) -> Optional[dict[str, Any]]:
"""Pull a checkpoint entry from various metadata locations."""
checkpoint_entry = metadata.get("checkpoint") or metadata.get("model")
if not checkpoint_entry:
gen_params = metadata.get("gen_params") or {}
checkpoint_entry = gen_params.get("checkpoint")
return checkpoint_entry if isinstance(checkpoint_entry, dict) else None
def _sanitize_checkpoint_entry(self, checkpoint_entry: Optional[dict[str, Any]]) -> Optional[dict[str, Any]]:
"""Remove transient/local-only fields from checkpoint metadata."""
if not checkpoint_entry:
return None
if not isinstance(checkpoint_entry, dict):
return checkpoint_entry
pruned = dict(checkpoint_entry)
for key in ("existsLocally", "localPath", "thumbnailUrl", "size", "downloadUrl"):
pruned.pop(key, None)
return pruned
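`_sanitize_checkpoint_entry` strips fields that only make sense on the machine that saved the recipe, so the persisted JSON stays portable. A standalone sketch of the same pruning:

```python
from typing import Any, Dict, Optional

# Fields that describe local state, not the checkpoint itself.
TRANSIENT_KEYS = ("existsLocally", "localPath", "thumbnailUrl", "size", "downloadUrl")

def sanitize_checkpoint_entry(entry: Optional[Dict[str, Any]]) -> Optional[Dict[str, Any]]:
    """Return a copy of the entry without transient/local-only fields."""
    if not entry:
        return None
    if not isinstance(entry, dict):
        return entry  # leave non-dict payloads for the normalizer to handle
    pruned = dict(entry)
    for key in TRANSIENT_KEYS:
        pruned.pop(key, None)
    return pruned

entry = {
    "name": "sdxl_base",
    "hash": "abc123",
    "localPath": "/models/sdxl.safetensors",
    "size": 6_938_040_682,
}
print(sanitize_checkpoint_entry(entry))  # {'name': 'sdxl_base', 'hash': 'abc123'}
```

The local-only fields are recomputed at read time by `_enrich_checkpoint_entry`, so nothing is lost by dropping them here.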
def _resolve_image_bytes(self, image_bytes: bytes | None, image_base64: str | None) -> bytes:
if image_bytes is not None:
return image_bytes

View File

@@ -2,14 +2,17 @@ import asyncio
import copy
import json
import os
import shutil
import logging
from pathlib import Path
from datetime import datetime, timezone
from threading import Lock
from typing import Any, Awaitable, Dict, Iterable, List, Mapping, Optional, Sequence, Tuple
from ..utils.constants import DEFAULT_PRIORITY_TAG_CONFIG
from ..utils.settings_paths import ensure_settings_file
from platformdirs import user_config_dir
from ..utils.constants import DEFAULT_HASH_CHUNK_SIZE_MB, DEFAULT_PRIORITY_TAG_CONFIG
from ..utils.settings_paths import APP_NAME, ensure_settings_file, get_legacy_settings_path
from ..utils.tag_priorities import (
PriorityTagEntry,
collect_canonical_tags,
@@ -29,6 +32,7 @@ CORE_USER_SETTING_KEYS: Tuple[str, ...] = (
DEFAULT_SETTINGS: Dict[str, Any] = {
"civitai_api_key": "",
"use_portable_settings": False,
"hash_chunk_size_mb": DEFAULT_HASH_CHUNK_SIZE_MB,
"language": "en",
"show_only_sfw": False,
"enable_metadata_archive_db": False,
@@ -57,12 +61,15 @@ DEFAULT_SETTINGS: Dict[str, Any] = {
"priority_tags": DEFAULT_PRIORITY_TAG_CONFIG.copy(),
"model_name_display": "model_name",
"model_card_footer_action": "example_images",
"update_flag_strategy": "same_base",
"auto_organize_exclusions": [],
}
class SettingsManager:
def __init__(self):
self.settings_file = ensure_settings_file(logger)
self._pending_portable_switch: Optional[Dict[str, str]] = None
self._standalone_mode = self._detect_standalone_mode()
self._startup_messages: List[Dict[str, Any]] = []
self._needs_initial_save = False
@@ -233,6 +240,17 @@ class SettingsManager:
)
inserted_defaults = True
if "auto_organize_exclusions" in self.settings:
normalized_exclusions = self.normalize_auto_organize_exclusions(
self.settings.get("auto_organize_exclusions")
)
if normalized_exclusions != self.settings.get("auto_organize_exclusions"):
self.settings["auto_organize_exclusions"] = normalized_exclusions
updated_existing = True
else:
self.settings["auto_organize_exclusions"] = []
inserted_defaults = True
for key, value in defaults.items():
if key == "priority_tags":
continue
@@ -713,6 +731,7 @@ class SettingsManager:
defaults['download_path_templates'] = {}
defaults['priority_tags'] = DEFAULT_PRIORITY_TAG_CONFIG.copy()
defaults.setdefault('folder_paths', {})
defaults['auto_organize_exclusions'] = []
library_name = defaults.get("active_library") or "default"
default_library = self._build_library_payload(
@@ -738,6 +757,35 @@ class SettingsManager:
return normalized
def normalize_auto_organize_exclusions(self, value: Any) -> List[str]:
if value is None:
return []
if isinstance(value, str):
candidates: Iterable[str] = (
value.replace("\n", ",").replace(";", ",").split(",")
)
elif isinstance(value, Sequence) and not isinstance(value, (bytes, bytearray, str)):
candidates = value
else:
return []
patterns: List[str] = []
for raw in candidates:
if isinstance(raw, str):
token = raw.strip()
if token:
patterns.append(token)
unique_patterns: List[str] = []
seen = set()
for pattern in patterns:
if pattern not in seen:
seen.add(pattern)
unique_patterns.append(pattern)
return unique_patterns
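`normalize_auto_organize_exclusions` accepts either a delimited string (commas, semicolons, or newlines) or a sequence, and returns a de-duplicated list that preserves first-seen order. A standalone sketch of the same rules:

```python
from collections.abc import Sequence
from typing import Any, Iterable, List

def normalize_exclusions(value: Any) -> List[str]:
    """Normalize exclusion patterns from a delimited string or a sequence."""
    if value is None:
        return []
    if isinstance(value, str):
        # Treat newlines and semicolons as commas, then split.
        candidates: Iterable[str] = value.replace("\n", ",").replace(";", ",").split(",")
    elif isinstance(value, Sequence) and not isinstance(value, (bytes, bytearray)):
        candidates = value
    else:
        return []
    seen, unique = set(), []
    for raw in candidates:
        if isinstance(raw, str):
            token = raw.strip()
            if token and token not in seen:
                seen.add(token)
                unique.append(token)
    return unique

print(normalize_exclusions("nsfw/*; wip/*\nnsfw/*, "))  # ['nsfw/*', 'wip/*']
```

Because the setter also routes through this normalizer, the stored setting is always a clean list regardless of what shape the UI submits.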
def get_priority_tag_config(self) -> Dict[str, str]:
stored_value = self.settings.get("priority_tags")
normalized = self._normalize_priority_tag_config(stored_value)
@@ -746,6 +794,15 @@ class SettingsManager:
self._save_settings()
return normalized.copy()
def get_auto_organize_exclusions(self) -> List[str]:
exclusions = self.normalize_auto_organize_exclusions(
self.settings.get("auto_organize_exclusions")
)
if exclusions != self.settings.get("auto_organize_exclusions"):
self.settings["auto_organize_exclusions"] = exclusions
self._save_settings()
return exclusions
def get_startup_messages(self) -> List[Dict[str, Any]]:
return [message.copy() for message in self._startup_messages]
@@ -781,7 +838,13 @@ class SettingsManager:
def set(self, key: str, value: Any) -> None:
"""Set setting value and save"""
if key == "auto_organize_exclusions":
value = self.normalize_auto_organize_exclusions(value)
self.settings[key] = value
portable_switch_pending = False
if key == "use_portable_settings" and isinstance(value, bool):
portable_switch_pending = True
self._prepare_portable_switch(value)
if key == 'folder_paths' and isinstance(value, Mapping):
self._update_active_library_entry(folder_paths=value) # type: ignore[arg-type]
elif key == 'default_lora_root':
@@ -793,6 +856,8 @@ class SettingsManager:
elif key == 'model_name_display':
self._notify_model_name_display_change(value)
self._save_settings()
if portable_switch_pending:
self._finalize_portable_switch()
def delete(self, key: str) -> None:
"""Delete setting key and save"""
@@ -801,6 +866,113 @@ class SettingsManager:
self._save_settings()
logger.info(f"Deleted setting: {key}")
def _prepare_portable_switch(self, use_portable: bool) -> None:
"""Prepare switching the settings storage location."""
legacy_path = get_legacy_settings_path()
user_dir = self._get_user_config_directory()
user_settings_path = os.path.join(user_dir, "settings.json")
target_path = legacy_path if use_portable else user_settings_path
other_path = user_settings_path if use_portable else legacy_path
target_dir = os.path.dirname(target_path)
os.makedirs(target_dir, exist_ok=True)
previous_path = self.settings_file or target_path
previous_dir = os.path.dirname(previous_path) or target_dir
if os.path.abspath(previous_path) != os.path.abspath(target_path):
self._copy_model_cache_directory(previous_dir, target_dir)
self._pending_portable_switch = {"other_path": other_path}
self.settings_file = target_path
def _finalize_portable_switch(self) -> None:
"""Mirror the latest settings file to the secondary location."""
info = self._pending_portable_switch
if not info:
return
other_path = info.get("other_path")
current_path = self.settings_file
if not other_path or not current_path:
self._pending_portable_switch = None
return
if os.path.abspath(other_path) == os.path.abspath(current_path):
self._pending_portable_switch = None
return
other_dir = os.path.dirname(other_path) or os.path.dirname(current_path)
if other_dir:
os.makedirs(other_dir, exist_ok=True)
try:
shutil.copy2(current_path, other_path)
except Exception as exc:
logger.warning("Failed to mirror settings.json to %s: %s", other_path, exc)
finally:
self._pending_portable_switch = None
def _copy_model_cache_directory(self, source_dir: str, target_dir: str) -> None:
"""Copy model_cache artifacts when switching storage locations."""
if not source_dir or not target_dir:
return
source_cache_dir = os.path.join(source_dir, "model_cache")
target_cache_dir = os.path.join(target_dir, "model_cache")
if (
os.path.isdir(source_cache_dir)
and os.path.abspath(source_cache_dir) != os.path.abspath(target_cache_dir)
):
try:
shutil.copytree(source_cache_dir, target_cache_dir, dirs_exist_ok=True)
except Exception as exc:
logger.warning(
"Failed to copy model_cache directory from %s to %s: %s",
source_cache_dir,
target_cache_dir,
exc,
)
source_cache_file = os.path.join(source_dir, "model_cache.sqlite")
target_cache_file = os.path.join(target_dir, "model_cache.sqlite")
if (
os.path.isfile(source_cache_file)
and os.path.abspath(source_cache_file) != os.path.abspath(target_cache_file)
):
try:
shutil.copy2(source_cache_file, target_cache_file)
except Exception as exc:
logger.warning(
"Failed to copy model_cache.sqlite from %s to %s: %s",
source_cache_file,
target_cache_file,
exc,
)
def _get_user_config_directory(self) -> str:
"""Return the user configuration directory, falling back to ~/.config."""
try:
config_dir = user_config_dir(APP_NAME, appauthor=False) or ""
except Exception as exc: # pragma: no cover - defensive fallback
logger.warning("Failed to determine user config directory: %s", exc)
config_dir = ""
if not config_dir:
config_dir = os.path.join(os.path.expanduser("~"), f".config/{APP_NAME}")
try:
os.makedirs(config_dir, exist_ok=True)
except Exception as exc:
logger.warning("Failed to create user config directory %s: %s", config_dir, exc)
return config_dir
def _notify_model_name_display_change(self, value: Any) -> None:
"""Trigger cache resorting when the model name display preference updates."""

View File

@@ -33,7 +33,8 @@ class TagUpdateService:
tags_added: List[str] = []
for tag in new_tags:
if isinstance(tag, str) and tag.strip():
-normalized = tag.strip()
+# Convert all tags to lowercase to avoid case sensitivity issues on Windows
+normalized = tag.strip().lower()
if normalized.lower() not in existing_lower:
existing_tags.append(normalized)
existing_lower.append(normalized.lower())

View File

@@ -39,6 +39,7 @@ class AutoOrganizeUseCase:
*,
file_paths: Optional[Sequence[str]] = None,
progress_callback: Optional[ProgressCallback] = None,
exclusion_patterns: Optional[Sequence[str]] = None,
) -> AutoOrganizeResult:
"""Run the auto-organize routine guarded by a shared lock."""
@@ -53,4 +54,5 @@ class AutoOrganizeUseCase:
return await self._file_service.auto_organize_models(
file_paths=list(file_paths) if file_paths is not None else None,
progress_callback=progress_callback,
exclusion_patterns=exclusion_patterns,
)

View File

@@ -2,9 +2,138 @@
from __future__ import annotations
from typing import Any, Dict, Iterable, Mapping, Sequence
from urllib.parse import urlparse, urlunparse
_DEFAULT_ALLOW_COMMERCIAL_USE: Sequence[str] = ("Sell",)
_LICENSE_DEFAULTS: Dict[str, Any] = {
"allowNoCredit": True,
"allowCommercialUse": _DEFAULT_ALLOW_COMMERCIAL_USE,
"allowDerivatives": True,
"allowDifferentLicense": True,
}
_COMMERCIAL_ALLOWED_VALUES = {"sell", "rent", "rentcivit", "image"}
_COMMERCIAL_SHIFT = 1
def _normalize_commercial_values(value: Any) -> Sequence[str]:
"""Return a normalized list of commercial permissions preserving source values."""
if value is None:
return list(_DEFAULT_ALLOW_COMMERCIAL_USE)
if isinstance(value, str):
return [value]
if isinstance(value, Iterable):
result = []
for item in value:
if item is None:
continue
if isinstance(item, str):
result.append(item)
continue
result.append(str(item))
if result:
return result
try:
if len(value) == 0: # type: ignore[arg-type]
return []
except TypeError:
pass
return list(_DEFAULT_ALLOW_COMMERCIAL_USE)
def _to_bool(value: Any, fallback: bool) -> bool:
if value is None:
return fallback
return bool(value)
def resolve_license_payload(model_data: Mapping[str, Any] | None) -> Dict[str, Any]:
"""Extract license fields from model metadata applying documented defaults."""
payload: Dict[str, Any] = {}
allow_no_credit = payload["allowNoCredit"] = _to_bool(
(model_data or {}).get("allowNoCredit"),
_LICENSE_DEFAULTS["allowNoCredit"],
)
commercial = _normalize_commercial_values(
(model_data or {}).get("allowCommercialUse"),
)
payload["allowCommercialUse"] = list(commercial)
allow_derivatives = payload["allowDerivatives"] = _to_bool(
(model_data or {}).get("allowDerivatives"),
_LICENSE_DEFAULTS["allowDerivatives"],
)
allow_different_license = payload["allowDifferentLicense"] = _to_bool(
(model_data or {}).get("allowDifferentLicense"),
_LICENSE_DEFAULTS["allowDifferentLicense"],
)
# Ensure booleans are plain bool instances
payload["allowNoCredit"] = bool(allow_no_credit)
payload["allowDerivatives"] = bool(allow_derivatives)
payload["allowDifferentLicense"] = bool(allow_different_license)
return payload
def _resolve_commercial_bits(values: Sequence[str]) -> int:
normalized_values = set()
for value in values:
normalized = str(value).strip().lower().replace("_", "").replace("-", "")
if normalized in _COMMERCIAL_ALLOWED_VALUES:
normalized_values.add(normalized)
has_sell = "sell" in normalized_values
has_rent = has_sell or "rent" in normalized_values
has_rentcivit = has_rent or "rentcivit" in normalized_values
has_image = has_sell or "image" in normalized_values
commercial_bits = (
(1 if has_sell else 0) << 3
| (1 if has_rent else 0) << 2
| (1 if has_rentcivit else 0) << 1
| (1 if has_image else 0)
)
return commercial_bits << _COMMERCIAL_SHIFT
def build_license_flags(payload: Mapping[str, Any] | None) -> int:
"""Encode license payload into a compact bitset for cache storage."""
resolved = resolve_license_payload(payload or {})
flags = 0
if resolved.get("allowNoCredit", True):
flags |= 1 << 0
commercial_bits = _resolve_commercial_bits(resolved.get("allowCommercialUse", ()))
flags |= commercial_bits
if resolved.get("allowDerivatives", True):
flags |= 1 << 5
if resolved.get("allowDifferentLicense", True):
flags |= 1 << 6
return flags
def resolve_license_info(model_data: Mapping[str, Any] | None) -> tuple[Dict[str, Any], int]:
"""Return normalized license payload and its encoded bitset."""
payload = resolve_license_payload(model_data)
return payload, build_license_flags(payload)
def rewrite_preview_url(source_url: str | None, media_type: str | None = None) -> tuple[str | None, bool]:
"""Rewrite Civitai preview URLs to use optimized renditions.
@@ -43,5 +172,9 @@ def rewrite_preview_url(source_url: str | None, media_type: str | None = None) -
return rewritten, True
__all__ = ["rewrite_preview_url"]
__all__ = [
"build_license_flags",
"resolve_license_payload",
"resolve_license_info",
"rewrite_preview_url",
]
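The flag layout used by `build_license_flags` and `_resolve_commercial_bits` above can be summarized in a minimal standalone sketch (`encode_license` is an illustrative name; the implication chain "sell" ⇒ "rent" ⇒ "rentcivit" and "sell" ⇒ "image" mirrors `_resolve_commercial_bits`):

```python
# Bit layout inferred from the code above:
#   bit 0: allowNoCredit
#   bit 1: commercial "image"      bit 2: commercial "rentcivit"
#   bit 3: commercial "rent"       bit 4: commercial "sell"
#   bit 5: allowDerivatives        bit 6: allowDifferentLicense
def encode_license(no_credit=True, commercial=("Sell",),
                   derivatives=True, different_license=True):
    vals = {str(v).strip().lower() for v in commercial}
    sell = "sell" in vals
    rent = sell or "rent" in vals          # "sell" implies the weaker grants
    rentcivit = rent or "rentcivit" in vals
    image = sell or "image" in vals
    flags = 1 if no_credit else 0
    flags |= (image << 1) | (rentcivit << 2) | (rent << 3) | (sell << 4)
    if derivatives:
        flags |= 1 << 5
    if different_license:
        flags |= 1 << 6
    return flags

print(encode_license())                      # defaults: all bits set == 127
print(encode_license(commercial=["Image"]))  # only "image" among commercial == 99
```

With the defaults ("Sell" granted), the "sell" bit forces all four commercial bits on, which is why the fully permissive payload encodes to 127.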

View File

@@ -55,6 +55,9 @@ CIVITAI_USER_MODEL_TYPES = [
'checkpoint',
]
# Default chunk size in megabytes used for hashing large files.
DEFAULT_HASH_CHUNK_SIZE_MB = 4
# Auto-organize settings
AUTO_ORGANIZE_BATCH_SIZE = 50 # Process models in batches to avoid overwhelming the system

View File

@@ -140,6 +140,28 @@ class ExifUtils:
if metadata:
# Remove any existing recipe metadata
metadata = ExifUtils.remove_recipe_metadata(metadata)
# Prepare checkpoint data
checkpoint_data = recipe_data.get("checkpoint") or {}
simplified_checkpoint = None
if isinstance(checkpoint_data, dict) and checkpoint_data:
simplified_checkpoint = {
"type": checkpoint_data.get("type", "checkpoint"),
"modelId": checkpoint_data.get("modelId", 0),
"modelVersionId": checkpoint_data.get("modelVersionId")
or checkpoint_data.get("id", 0),
"modelName": checkpoint_data.get(
"modelName", checkpoint_data.get("name", "")
),
"modelVersionName": checkpoint_data.get(
"modelVersionName", checkpoint_data.get("version", "")
),
"hash": checkpoint_data.get("hash", "").lower()
if checkpoint_data.get("hash")
else "",
"file_name": checkpoint_data.get("file_name", ""),
"baseModel": checkpoint_data.get("baseModel", ""),
}
# Prepare simplified loras data
simplified_loras = []
@@ -160,7 +182,8 @@ class ExifUtils:
'base_model': recipe_data.get('base_model', ''),
'loras': simplified_loras,
'gen_params': recipe_data.get('gen_params', {}),
'tags': recipe_data.get('tags', [])
'tags': recipe_data.get('tags', []),
**({'checkpoint': simplified_checkpoint} if simplified_checkpoint else {})
}
# Convert to JSON string
@@ -359,4 +382,4 @@ class ExifUtils:
return f.read(), os.path.splitext(image_data)[1]
except Exception:
return image_data, '.jpg' # Last resort fallback
return image_data, '.jpg'
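The checkpoint simplification in the hunk above reads as a pure function: pick stable field names, fall back to legacy keys (`"id"`, `"name"`, `"version"`), and lowercase the hash so EXIF comparisons are case-insensitive. A sketch of that mapping (`simplify_checkpoint` is an illustrative name):

```python
def simplify_checkpoint(cp):
    """Reduce raw checkpoint metadata to the fields stored in EXIF recipes."""
    if not isinstance(cp, dict) or not cp:
        return None
    return {
        "type": cp.get("type", "checkpoint"),
        "modelId": cp.get("modelId", 0),
        "modelVersionId": cp.get("modelVersionId") or cp.get("id", 0),
        "modelName": cp.get("modelName", cp.get("name", "")),
        "modelVersionName": cp.get("modelVersionName", cp.get("version", "")),
        "hash": (cp.get("hash") or "").lower(),
        "file_name": cp.get("file_name", ""),
        "baseModel": cp.get("baseModel", ""),
    }

legacy = {"id": 42, "name": "DreamShaper", "hash": "ABCD1234"}
print(simplify_checkpoint(legacy)["modelVersionId"])  # 42 via the "id" fallback
```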

View File

@@ -1,17 +1,41 @@
import hashlib
import logging
import os
from .constants import PREVIEW_EXTENSIONS, CARD_PREVIEW_WIDTH
from .constants import (
CARD_PREVIEW_WIDTH,
DEFAULT_HASH_CHUNK_SIZE_MB,
PREVIEW_EXTENSIONS,
)
from .exif_utils import ExifUtils
from ..services.settings_manager import get_settings_manager
logger = logging.getLogger(__name__)
def _get_hash_chunk_size_bytes() -> int:
"""Return the chunk size used for hashing, in bytes."""
settings_manager = get_settings_manager()
chunk_size_mb = settings_manager.get("hash_chunk_size_mb", DEFAULT_HASH_CHUNK_SIZE_MB)
try:
chunk_size_value = float(chunk_size_mb)
except (TypeError, ValueError):
chunk_size_value = float(DEFAULT_HASH_CHUNK_SIZE_MB)
if chunk_size_value <= 0:
chunk_size_value = float(DEFAULT_HASH_CHUNK_SIZE_MB)
return max(1, int(chunk_size_value * 1024 * 1024))
async def calculate_sha256(file_path: str) -> str:
"""Calculate SHA256 hash of a file"""
sha256_hash = hashlib.sha256()
chunk_size = _get_hash_chunk_size_bytes()
with open(file_path, "rb") as f:
for byte_block in iter(lambda: f.read(128 * 1024), b""):
for byte_block in iter(lambda: f.read(chunk_size), b""):
sha256_hash.update(byte_block)
return sha256_hash.hexdigest()
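The clamping in `_get_hash_chunk_size_bytes` above can be sketched without the settings manager: invalid or non-positive settings fall back to the 4 MB default, and the result is floored at 1 byte (`chunk_size_bytes` is an illustrative name):

```python
DEFAULT_HASH_CHUNK_SIZE_MB = 4

def chunk_size_bytes(setting):
    """Convert a user-supplied chunk size in MB to bytes, defensively."""
    try:
        mb = float(setting)
    except (TypeError, ValueError):
        mb = float(DEFAULT_HASH_CHUNK_SIZE_MB)
    if mb <= 0:
        mb = float(DEFAULT_HASH_CHUNK_SIZE_MB)
    return max(1, int(mb * 1024 * 1024))

print(chunk_size_bytes("not-a-number"))  # 4194304 (default 4 MB)
print(chunk_size_bytes(0.5))             # 524288
```

Raising the chunk size from the previous fixed 128 KiB reads mostly trades memory for fewer syscalls when hashing multi-gigabyte model files.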
@@ -81,4 +105,4 @@ def get_preview_extension(preview_path: str) -> str:
def normalize_path(path: str) -> str:
"""Normalize file path to use forward slashes"""
return path.replace(os.sep, "/") if path else path

View File

@@ -22,7 +22,7 @@ class MetadataManager:
"""
@staticmethod
async def load_metadata(file_path: str, model_class: Type[BaseModelMetadata] = LoraMetadata) -> Optional[BaseModelMetadata]:
async def load_metadata(file_path: str, model_class: Type[BaseModelMetadata] = LoraMetadata) -> tuple[Optional[BaseModelMetadata], bool]:
"""
Load metadata safely.

View File

@@ -205,7 +205,9 @@ def calculate_relative_path_for_model(model_data: Dict, model_type: str = 'lora'
base_model_mappings = settings_manager.get('base_model_path_mappings', {})
mapped_base_model = base_model_mappings.get(base_model, base_model)
first_tag = settings_manager.resolve_priority_tag_for_model(model_tags, model_type)
# Convert all tags to lowercase to avoid case sensitivity issues on Windows
lowercase_tags = [tag.lower() for tag in model_tags if isinstance(tag, str)]
first_tag = settings_manager.resolve_priority_tag_for_model(lowercase_tags, model_type)
if not first_tag:
first_tag = 'no tags' # Default if no tags available

View File

@@ -1,7 +1,7 @@
[project]
name = "comfyui-lora-manager"
description = "Revolutionize your workflow with the ultimate LoRA companion for ComfyUI!"
version = "0.9.9"
version = "0.9.11"
license = {file = "LICENSE"}
dependencies = [
"aiohttp",

View File

@@ -11,7 +11,11 @@
"type": "LORA",
"nsfw": false,
"description": "description",
"tags": ["style"]
"tags": ["style"],
"allowNoCredit": true,
"allowCommercialUse": ["Sell"],
"allowDerivatives": true,
"allowDifferentLicense": true
},
"files": [
{

View File

@@ -14,5 +14,6 @@
"C:/path/to/your/embeddings_folder",
"C:/path/to/another/embeddings_folder"
]
}
},
"auto_organize_exclusions": []
}

View File

@@ -48,9 +48,11 @@ html, body {
/* Composed Colors */
--lora-accent: oklch(var(--lora-accent-l) var(--lora-accent-c) var(--lora-accent-h));
--lora-surface: oklch(97% 0 0 / 0.95);
--lora-border: oklch(90% 0.02 256 / 0.15);
--lora-border: oklch(72% 0.03 256 / 0.45);
--lora-text: oklch(95% 0.02 256);
--lora-error: oklch(75% 0.32 29);
--lora-error-bg: color-mix(in oklch, var(--lora-error) 20%, transparent);
--lora-error-border: color-mix(in oklch, var(--lora-error) 50%, transparent);
--lora-warning: oklch(var(--lora-warning-l) var(--lora-warning-c) var(--lora-warning-h));
--lora-success: oklch(var(--lora-success-l) var(--lora-success-c) var(--lora-success-h));
--badge-update-bg: oklch(72% 0.2 220);
@@ -103,6 +105,8 @@ html[data-theme="light"] {
--lora-border: oklch(90% 0.02 256 / 0.15);
--lora-text: oklch(98% 0.02 256);
--lora-warning: oklch(75% 0.25 80); /* Modified to be used with oklch() */
--lora-error-bg: color-mix(in oklch, var(--lora-error) 15%, transparent);
--lora-error-border: color-mix(in oklch, var(--lora-error) 40%, transparent);
--badge-update-bg: oklch(62% 0.18 220);
--badge-update-text: oklch(98% 0.02 240);
--badge-update-glow: oklch(62% 0.18 220 / 0.4);

View File

@@ -9,6 +9,42 @@
border-bottom: 1px solid var(--lora-border);
}
.modal-header-actions {
display: flex;
align-items: center;
gap: var(--space-2);
flex-wrap: wrap;
width: 100%;
margin-bottom: var(--space-1);
}
.modal-header-actions .license-restrictions {
margin-left: auto;
}
.license-restrictions {
display: flex;
align-items: center;
gap: 8px;
padding: 4px 0;
}
.license-restrictions .license-icon {
width: 22px;
height: 22px;
display: inline-block;
background-color: var(--text-muted);
-webkit-mask: var(--license-icon-image) center/contain no-repeat;
mask: var(--license-icon-image) center/contain no-repeat;
transition: background-color 0.2s ease, transform 0.2s ease;
cursor: default;
}
.license-restrictions .license-icon:hover {
background-color: var(--text-color);
transform: translateY(-1px);
}
/* Info Grid */
.info-grid {
display: grid;
@@ -798,7 +834,7 @@
display: flex;
align-items: center;
gap: 10px;
margin-bottom: var(--space-1);
margin-bottom: 0;
flex-wrap: wrap;
}

View File

@@ -24,12 +24,29 @@
color: var(--text-color);
}
.sr-only {
position: absolute;
width: 1px;
height: 1px;
padding: 0;
margin: -1px;
overflow: hidden;
clip: rect(0, 0, 0, 0);
border: 0;
}
.versions-toolbar-info p {
margin: 0;
font-size: 0.85rem;
color: var(--text-muted);
}
.versions-toolbar-info-heading {
display: flex;
align-items: center;
gap: var(--space-2);
}
.versions-toolbar-actions {
display: flex;
flex-wrap: wrap;
@@ -68,6 +85,41 @@
color: var(--text-color);
}
.versions-filter-toggle {
appearance: none;
border: 1px solid var(--border-color);
border-radius: var(--border-radius-sm);
padding: 0;
margin-bottom: 4px;
width: 30px;
height: 30px;
background: color-mix(in oklch, var(--card-bg) 80%, var(--bg-color));
align-self: center;
display: inline-flex;
align-items: center;
justify-content: center;
color: var(--text-muted);
transition: border-color 0.2s ease, background 0.2s ease, color 0.2s ease, transform 0.2s ease;
position: relative;
cursor: pointer;
}
.versions-filter-toggle i {
font-size: 1rem;
}
.versions-filter-toggle:hover:not(:disabled) {
border-color: var(--text-color);
color: var(--text-color);
transform: translateY(-1px);
}
.versions-filter-toggle[data-filter-active="true"] {
border-color: color-mix(in oklch, var(--lora-accent) 65%, transparent);
color: var(--lora-accent);
background: color-mix(in oklch, var(--lora-accent) 20%, var(--card-bg) 80%);
}
.versions-toolbar-btn:disabled {
opacity: 0.6;
cursor: not-allowed;

View File

@@ -315,7 +315,8 @@ button:disabled,
overflow: hidden;
}
.delete-preview img {
.delete-preview img,
.delete-preview video {
width: 100%;
height: auto;
max-height: 150px;
@@ -345,4 +346,4 @@ button:disabled,
font-style: italic;
margin-top: var(--space-1);
text-align: center;
}

View File

@@ -233,6 +233,11 @@
resize: vertical;
}
.auto-organize-exclusions-input {
width: 100%;
box-sizing: border-box;
}
.priority-tags-input:focus {
border-color: var(--lora-accent);
outline: none;
@@ -261,6 +266,10 @@
margin-bottom: 0;
}
.auto-organize-exclusions-item {
gap: var(--space-2);
}
.priority-tags-example {
font-size: 0.85em;
opacity: 0.8;

View File

@@ -588,6 +588,26 @@
padding-top: 4px; /* Add padding to prevent first item from being cut off when hovered */
}
.recipe-resources-list {
display: flex;
flex-direction: column;
gap: 10px;
flex: 1;
min-height: 0;
}
.recipe-checkpoint-container {
display: flex;
flex-direction: column;
gap: var(--space-1);
}
.version-divider {
height: 1px;
background: var(--border-color);
margin: var(--space-1) 0;
}
.recipe-lora-item {
display: flex;
gap: var(--space-2);
@@ -614,6 +634,13 @@
border-left: 4px solid var(--lora-accent);
}
.recipe-lora-item.checkpoint-item {
cursor: pointer;
padding-top: 8px;
padding-bottom: 8px;
align-items: center;
}
.recipe-lora-item.missing-locally {
border-left: 4px solid var(--lora-error);
}
@@ -962,6 +989,10 @@
z-index: 100;
}
.badge-container .resource-action {
margin-left: auto;
}
/* Add styles for missing LoRAs download feature */
.recipe-status.missing {
position: relative;
@@ -1004,3 +1035,61 @@
.recipe-status.clickable:hover {
background-color: rgba(var(--lora-warning-rgb, 255, 165, 0), 0.2);
}
.recipe-checkpoint-meta {
display: flex;
flex-wrap: wrap;
gap: 8px;
align-items: center;
font-size: 0.85em;
margin-bottom: 2px;
}
.recipe-checkpoint-meta .checkpoint-type {
background: var(--lora-surface);
padding: 2px 8px;
border-radius: var(--border-radius-xs);
color: var(--text-color);
}
.recipe-resource-actions {
display: flex;
flex-wrap: wrap;
align-items: center;
gap: 8px;
margin-top: 2px;
}
.resource-action {
display: inline-flex;
align-items: center;
gap: 6px;
padding: 5px 10px;
border-radius: var(--border-radius-xs);
border: 1px solid var(--border-color);
background: var(--bg-color);
color: var(--text-color);
font-size: 0.9em;
cursor: pointer;
transition: background-color 0.2s ease, border-color 0.2s ease, transform 0.2s ease;
}
.resource-action.compact {
padding: 4px 10px;
font-size: 0.88em;
}
.resource-action:hover {
background: var(--lora-surface);
transform: translateY(-1px);
}
.resource-action.primary {
background: var(--lora-accent);
color: white;
border-color: var(--lora-accent);
}
.resource-action.primary:hover {
background: color-mix(in oklch, var(--lora-accent), black 10%);
}

View File

@@ -235,6 +235,13 @@
border-color: var(--lora-accent);
}
/* Exclude state styling for filter tags */
.filter-tag.exclude {
background-color: var(--lora-error-bg);
color: var(--lora-error);
border-color: var(--lora-error-border);
}
/* Tag filter styles */
.tag-filter {
display: flex;

View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="icon icon-tabler icons-tabler-outline icon-tabler-brush-off"><path stroke="none" d="M0 0h24v24H0z" fill="none"/><path d="M3 17a4 4 0 1 1 4 4h-4v-4z" /><path d="M21 3a16 16 0 0 0 -9.309 4.704m-1.795 2.212a15.993 15.993 0 0 0 -1.696 3.284" /><path d="M21 3a16 16 0 0 1 -4.697 9.302m-2.195 1.786a15.993 15.993 0 0 1 -3.308 1.712" /><path d="M3 3l18 18" /></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="icon icon-tabler icons-tabler-outline icon-tabler-exchange-off"><path stroke="none" d="M0 0h24v24H0z" fill="none"/><path d="M5 18m-2 0a2 2 0 1 0 4 0a2 2 0 1 0 -4 0" /><path d="M19 6m-2 0a2 2 0 1 0 4 0a2 2 0 1 0 -4 0" /><path d="M19 8v5c0 .594 -.104 1.164 -.294 1.692m-1.692 2.298a4.978 4.978 0 0 1 -3.014 1.01h-3l3 -3" /><path d="M14 21l-3 -3" /><path d="M5 16v-5c0 -1.632 .782 -3.082 1.992 -4m3.008 -1h3l-3 -3" /><path d="M11.501 7.499l1.499 -1.499" /><path d="M3 3l18 18" /></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="icon icon-tabler icons-tabler-outline icon-tabler-photo-off"><path stroke="none" d="M0 0h24v24H0z" fill="none"/><path d="M15 8h.01" /><path d="M7 3h11a3 3 0 0 1 3 3v11m-.856 3.099a2.991 2.991 0 0 1 -2.144 .901h-12a3 3 0 0 1 -3 -3v-12c0 -.845 .349 -1.608 .91 -2.153" /><path d="M3 16l5 -5c.928 -.893 2.072 -.893 3 0l5 5" /><path d="M16.33 12.338c.574 -.054 1.155 .166 1.67 .662l3 3" /><path d="M3 3l18 18" /></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="icon icon-tabler icons-tabler-outline icon-tabler-rotate-2"><path stroke="none" d="M0 0h24v24H0z" fill="none"/><path d="M15 4.55a8 8 0 0 0 -6 14.9m0 -4.45v5h-5" /><path d="M18.37 7.16l0 .01" /><path d="M13 19.94l0 .01" /><path d="M16.84 18.37l0 .01" /><path d="M19.37 15.1l0 .01" /><path d="M19.94 11l0 .01" /></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="icon icon-tabler icons-tabler-outline icon-tabler-shopping-cart-off"><path stroke="none" d="M0 0h24v24H0z" fill="none"/><path d="M6 19m-2 0a2 2 0 1 0 4 0a2 2 0 1 0 -4 0" /><path d="M17 17a2 2 0 1 0 2 2" /><path d="M17 17h-11v-11" /><path d="M9.239 5.231l10.761 .769l-1 7h-2m-4 0h-7" /><path d="M3 3l18 18" /></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="icon icon-tabler icons-tabler-outline icon-tabler-user-check"><path stroke="none" d="M0 0h24v24H0z" fill="none"/><path d="M8 7a4 4 0 1 0 8 0a4 4 0 0 0 -8 0" /><path d="M6 21v-2a4 4 0 0 1 4 -4h4" /><path d="M15 19l2 2l4 -4" /></svg>


View File

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" width="24" height="24" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2" stroke-linecap="round" stroke-linejoin="round" class="icon icon-tabler icons-tabler-outline icon-tabler-world-off"><path stroke="none" d="M0 0h24v24H0z" fill="none"/><path d="M5.657 5.615a9 9 0 1 0 12.717 12.739m1.672 -2.322a9 9 0 0 0 -12.066 -12.084" /><path d="M3.6 9h5.4m4 0h7.4" /><path d="M3.6 15h11.4m4 0h1.4" /><path d="M11.5 3a17.001 17.001 0 0 0 -1.493 3.022m-.847 3.145c-.68 4.027 .1 8.244 2.34 11.833" /><path d="M12.5 3a16.982 16.982 0 0 1 2.549 8.005m-.207 3.818a16.979 16.979 0 0 1 -2.342 6.177" /><path d="M3 3l18 18" /></svg>


View File

@@ -77,6 +77,7 @@ export function getApiEndpoints(modelType) {
relinkCivitai: `/api/lm/${modelType}/relink-civitai`,
civitaiVersions: `/api/lm/${modelType}/civitai/versions`,
refreshUpdates: `/api/lm/${modelType}/updates/refresh`,
fetchMissingLicenses: `/api/lm/${modelType}/updates/fetch-missing-license`,
modelUpdateStatus: `/api/lm/${modelType}/updates/status`,
modelUpdateVersions: `/api/lm/${modelType}/updates/versions`,
ignoreModelUpdate: `/api/lm/${modelType}/updates/ignore`,

View File

@@ -806,9 +806,13 @@ export class BaseModelApiClient {
params.append('recursive', pageState.searchOptions.recursive ? 'true' : 'false');
if (pageState.filters) {
if (pageState.filters.tags && pageState.filters.tags.length > 0) {
pageState.filters.tags.forEach(tag => {
params.append('tag', tag);
if (pageState.filters.tags && Object.keys(pageState.filters.tags).length > 0) {
Object.entries(pageState.filters.tags).forEach(([tag, state]) => {
if (state === 'include') {
params.append('tag_include', tag);
} else if (state === 'exclude') {
params.append('tag_exclude', tag);
}
});
}
@@ -817,6 +821,39 @@ export class BaseModelApiClient {
params.append('base_model', model);
});
}
// Add license filters
if (pageState.filters.license) {
const licenseFilters = pageState.filters.license;
if (licenseFilters.noCredit) {
// For noCredit filter:
// - 'include' means credit_required=False (no credit required)
// - 'exclude' means credit_required=True (credit required)
if (licenseFilters.noCredit === 'include') {
params.append('credit_required', 'false');
} else if (licenseFilters.noCredit === 'exclude') {
params.append('credit_required', 'true');
}
}
if (licenseFilters.allowSelling) {
// For allowSelling filter:
// - 'include' means allow_selling_generated_content=True
// - 'exclude' means allow_selling_generated_content=False
if (licenseFilters.allowSelling === 'include') {
params.append('allow_selling_generated_content', 'true');
} else if (licenseFilters.allowSelling === 'exclude') {
params.append('allow_selling_generated_content', 'false');
}
}
}
if (pageState.filters.modelTypes && pageState.filters.modelTypes.length > 0) {
pageState.filters.modelTypes.forEach((type) => {
params.append('model_type', type);
});
}
}
this._addModelSpecificParams(params, pageState);
@@ -840,6 +877,21 @@ export class BaseModelApiClient {
console.error('Error parsing lora hashes from session storage:', error);
}
}
} else if (this.modelType === 'checkpoints') {
const filterCheckpointHash = getSessionItem('recipe_to_checkpoint_filterHash');
const filterCheckpointHashes = getSessionItem('recipe_to_checkpoint_filterHashes');
if (filterCheckpointHash) {
params.append('checkpoint_hash', filterCheckpointHash);
} else if (filterCheckpointHashes) {
try {
if (Array.isArray(filterCheckpointHashes) && filterCheckpointHashes.length > 0) {
params.append('checkpoint_hashes', filterCheckpointHashes.join(','));
}
} catch (error) {
console.error('Error parsing checkpoint hashes from session storage:', error);
}
}
}
}
@@ -1215,15 +1267,24 @@ export class BaseModelApiClient {
// Start the auto-organize operation
const endpoint = this.apiConfig.endpoints.autoOrganize;
const requestOptions = {
method: filePaths ? 'POST' : 'GET',
headers: filePaths ? { 'Content-Type': 'application/json' } : {}
};
const exclusionPatterns = (state.global.settings.auto_organize_exclusions || [])
.filter(pattern => typeof pattern === 'string' && pattern.trim())
.map(pattern => pattern.trim());
const requestBody = {};
if (filePaths) {
requestOptions.body = JSON.stringify({ file_paths: filePaths });
requestBody.file_paths = filePaths;
}
if (exclusionPatterns.length > 0) {
requestBody.exclusion_patterns = exclusionPatterns;
}
const requestOptions = {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(requestBody),
};
const response = await fetch(endpoint, requestOptions);
if (!response.ok) {

View File

@@ -66,8 +66,14 @@ export async function fetchRecipesPage(page = 1, pageSize = 100) {
}
// Add tag filters
if (pageState.filters?.tags && pageState.filters.tags.length) {
params.append('tags', pageState.filters.tags.join(','));
if (pageState.filters?.tags && Object.keys(pageState.filters.tags).length) {
Object.entries(pageState.filters.tags).forEach(([tag, state]) => {
if (state === 'include') {
params.append('tag_include', tag);
} else if (state === 'exclude') {
params.append('tag_exclude', tag);
}
});
}
}

View File

@@ -1,6 +1,8 @@
import { BaseContextMenu } from './BaseContextMenu.js';
import { showToast } from '../../utils/uiHelpers.js';
import { translate } from '../../utils/i18nHelpers.js';
import { state } from '../../state/index.js';
import { getCompleteApiConfig, getCurrentModelType } from '../../api/apiConfig.js';
import { performModelUpdateCheck } from '../../utils/updateCheckHelpers.js';
export class GlobalContextMenu extends BaseContextMenu {
@@ -8,6 +10,7 @@ export class GlobalContextMenu extends BaseContextMenu {
super('globalContextMenu');
this._cleanupInProgress = false;
this._updateCheckInProgress = false;
this._licenseRefreshInProgress = false;
}
showMenu(x, y, origin = null) {
@@ -32,6 +35,11 @@ export class GlobalContextMenu extends BaseContextMenu {
console.error('Failed to check model updates:', error);
});
break;
case 'fetch-missing-licenses':
this.fetchMissingLicenses(menuItem).catch((error) => {
console.error('Failed to refresh missing license metadata:', error);
});
break;
default:
console.warn(`Unhandled global context menu action: ${action}`);
break;
@@ -133,4 +141,98 @@ export class GlobalContextMenu extends BaseContextMenu {
}
}
}
async fetchMissingLicenses(menuItem) {
if (this._licenseRefreshInProgress) {
return;
}
const modelType = getCurrentModelType();
const apiConfig = getCompleteApiConfig(modelType);
const displayName = apiConfig?.config?.displayName ?? 'Model';
const typePlural = this._buildTypePlural(displayName);
const loadingMessage = translate(
'globalContextMenu.fetchMissingLicenses.loading',
{ type: displayName, typePlural },
`Refreshing license metadata for ${typePlural}...`
);
const endpoint = apiConfig?.endpoints?.fetchMissingLicenses;
if (!endpoint) {
console.warn('Fetch missing license endpoint not configured for model type:', modelType);
showToast(
'globalContextMenu.fetchMissingLicenses.error',
{ message: 'Endpoint unavailable', type: displayName, typePlural },
'warning'
);
return;
}
this._licenseRefreshInProgress = true;
menuItem?.classList?.add('disabled');
state.loadingManager?.showSimpleLoading?.(loadingMessage);
try {
const response = await fetch(endpoint, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({}),
});
let payload = {};
try {
payload = await response.json();
} catch {
payload = {};
}
if (!response.ok || payload.success !== true) {
const errorMessage = payload?.error || response.statusText || 'Unknown error';
throw new Error(errorMessage);
}
const updated = Array.isArray(payload.updated) ? payload.updated : [];
if (updated.length > 0) {
showToast(
'globalContextMenu.fetchMissingLicenses.success',
{ count: updated.length, type: displayName, typePlural },
'success'
);
} else {
showToast(
'globalContextMenu.fetchMissingLicenses.none',
{ type: displayName, typePlural },
'info'
);
}
} catch (error) {
console.error('Failed to refresh missing license metadata:', error);
showToast(
'globalContextMenu.fetchMissingLicenses.error',
{ message: error?.message ?? 'Unknown error', type: displayName, typePlural },
'error'
);
} finally {
state.loadingManager?.hide?.();
if (typeof state.loadingManager?.restoreProgressBar === 'function') {
state.loadingManager.restoreProgressBar();
}
this._licenseRefreshInProgress = false;
menuItem?.classList?.remove('disabled');
}
}
_buildTypePlural(displayName) {
if (!displayName) {
return 'models';
}
const lower = displayName.toLowerCase();
if (lower.endsWith('s')) {
return displayName;
}
return `${displayName}s`;
}
}

View File

@@ -1,8 +1,10 @@
import { showToast, getNSFWLevelName, openExampleImagesFolder } from '../../utils/uiHelpers.js';
import { modalManager } from '../../managers/ModalManager.js';
import { state } from '../../state/index.js';
import { getModelApiClient } from '../../api/modelApiFactory.js';
import { getModelApiClient, resetAndReload } from '../../api/modelApiFactory.js';
import { bulkManager } from '../../managers/BulkManager.js';
import { MODEL_CONFIG } from '../../api/apiConfig.js';
import { translate } from '../../utils/i18nHelpers.js';
// Mixin with shared functionality for LoraContextMenu and CheckpointContextMenu
export const ModelContextMenuMixin = {
@@ -245,7 +247,88 @@ export const ModelContextMenuMixin = {
return { modelId: null, modelVersionId: null };
}
},
parseModelId(value) {
if (value === undefined || value === null || value === '') {
return null;
}
const parsed = Number.parseInt(value, 10);
return Number.isNaN(parsed) ? null : parsed;
},
getModelIdFromCard(card) {
if (!card) {
return null;
}
if (card.dataset?.meta) {
try {
const meta = JSON.parse(card.dataset.meta);
const metaValue = this.parseModelId(meta?.modelId);
if (metaValue !== null) {
return metaValue;
}
} catch (error) {
console.warn('Unable to parse card metadata for model ID', error);
}
}
return null;
},
async checkUpdatesForCurrentModel() {
const card = this.currentCard;
if (!card) {
return;
}
const modelId = this.getModelIdFromCard(card);
const typeConfig = MODEL_CONFIG[this.modelType] || {};
const typeLabel = (typeConfig.displayName || 'Model').toLowerCase();
if (modelId === null) {
showToast('toast.models.bulkUpdatesMissing', { type: typeLabel }, 'warning');
return;
}
const apiClient = getModelApiClient();
const loadingMessage = translate(
'toast.models.bulkUpdatesChecking',
{ count: 1, type: typeLabel },
`Checking selected ${typeLabel}(s) for updates...`
);
state.loadingManager.showSimpleLoading(loadingMessage);
try {
const response = await apiClient.refreshUpdatesForModels([modelId]);
const records = Array.isArray(response?.records) ? response.records : [];
const updatesCount = records.length;
if (updatesCount > 0) {
showToast('toast.models.bulkUpdatesSuccess', { count: updatesCount, type: typeLabel }, 'success');
} else {
showToast('toast.models.bulkUpdatesNone', { type: typeLabel }, 'info');
}
const resetFn = this.resetAndReload || resetAndReload;
if (typeof resetFn === 'function') {
await resetFn(false);
}
} catch (error) {
console.error('Error checking updates for model:', error);
showToast(
'toast.models.bulkUpdatesFailed',
{ type: typeLabel, message: error?.message ?? 'Unknown error' },
'error'
);
} finally {
state.loadingManager.hide();
state.loadingManager.restoreProgressBar();
}
},
// Common action handlers
handleCommonMenuActions(action) {
switch(action) {
@@ -272,6 +355,9 @@ export const ModelContextMenuMixin = {
case 'set-nsfw':
this.showNSFWLevelSelector(null, null, this.currentCard);
return true;
case 'check-updates':
this.checkUpdatesForCurrentModel();
return true;
default:
return false;
}

View File

@@ -3,7 +3,7 @@ import { showToast, copyToClipboard, sendLoraToWorkflow } from '../utils/uiHelpe
import { modalManager } from '../managers/ModalManager.js';
import { getCurrentPageState } from '../state/index.js';
import { state } from '../state/index.js';
import { NSFW_LEVELS } from '../utils/constants.js';
import { NSFW_LEVELS, getBaseModelAbbreviation } from '../utils/constants.js';
class RecipeCard {
constructor(recipe, clickHandler) {
@@ -24,8 +24,10 @@ class RecipeCard {
card.dataset.created = this.recipe.created_date;
card.dataset.id = this.recipe.id || '';
// Get base model
const baseModel = this.recipe.base_model || '';
// Get base model with fallback
const baseModelLabel = (this.recipe.base_model || '').trim() || 'Unknown';
const baseModelAbbreviation = getBaseModelAbbreviation(baseModelLabel);
const baseModelDisplay = baseModelLabel === 'Unknown' ? 'Unknown' : baseModelAbbreviation;
// Ensure loras array exists
const loras = this.recipe.loras || [];
@@ -71,7 +73,7 @@ class RecipeCard {
`<button class="toggle-blur-btn" title="Toggle blur">
<i class="fas fa-eye"></i>
</button>` : ''}
${baseModel ? `<span class="base-model-label ${shouldBlur ? 'with-toggle' : ''}" title="${baseModel}">${baseModel}</span>` : ''}
<span class="base-model-label ${shouldBlur ? 'with-toggle' : ''}" title="${baseModelLabel}">${baseModelDisplay}</span>
<div class="card-actions">
<i class="fas fa-share-alt" title="Share Recipe"></i>
<i class="fas fa-paper-plane" title="Send Recipe to Workflow (Click: Append, Shift+Click: Replace)"></i>
@@ -376,4 +378,4 @@ class RecipeCard {
}
}
export { RecipeCard };

View File

@@ -1,8 +1,11 @@
// Recipe Modal Component
import { showToast, copyToClipboard } from '../utils/uiHelpers.js';
import { showToast, copyToClipboard, sendModelPathToWorkflow } from '../utils/uiHelpers.js';
import { translate } from '../utils/i18nHelpers.js';
import { state } from '../state/index.js';
import { setSessionItem, removeSessionItem } from '../utils/storageHelpers.js';
import { updateRecipeMetadata } from '../api/recipeApi.js';
import { downloadManager } from '../managers/DownloadManager.js';
import { MODEL_TYPES } from '../api/apiConfig.js';
class RecipeModal {
constructor() {
@@ -339,6 +342,18 @@ class RecipeModal {
if (negativePromptElement) negativePromptElement.textContent = 'No negative prompt information available';
if (otherParamsElement) otherParamsElement.innerHTML = '<div class="no-params">No parameters available</div>';
}
const checkpointContainer = document.getElementById('recipeCheckpoint');
const resourceDivider = document.getElementById('recipeResourceDivider');
if (checkpointContainer) {
checkpointContainer.innerHTML = '';
if (recipe.checkpoint && typeof recipe.checkpoint === 'object') {
checkpointContainer.innerHTML = this.renderCheckpoint(recipe.checkpoint);
this.setupCheckpointActions(checkpointContainer, recipe.checkpoint);
this.setupCheckpointNavigation(checkpointContainer, recipe.checkpoint);
}
}
// Set LoRAs list and count
const lorasListElement = document.getElementById('recipeLorasList');
@@ -492,6 +507,12 @@ class RecipeModal {
lorasListElement.innerHTML = '<div class="no-loras">No LoRAs associated with this recipe</div>';
this.recipeLorasSyntax = '';
}
if (resourceDivider) {
const hasCheckpoint = checkpointContainer && checkpointContainer.querySelector('.recipe-lora-item');
const hasLoraItems = lorasListElement && lorasListElement.querySelector('.recipe-lora-item');
resourceDivider.style.display = hasCheckpoint && hasLoraItems ? 'block' : 'none';
}
// Show the modal
modalManager.showModal('recipeModal');
@@ -1047,8 +1068,222 @@ class RecipeModal {
}
}
renderCheckpoint(checkpoint) {
const existsLocally = !!checkpoint.inLibrary;
const localPath = checkpoint.localPath || '';
const previewUrl = checkpoint.preview_url || checkpoint.thumbnailUrl || '/loras_static/images/no-preview.png';
const isPreviewVideo = typeof previewUrl === 'string' && previewUrl.toLowerCase().endsWith('.mp4');
const checkpointName = checkpoint.name || checkpoint.modelName || checkpoint.file_name || 'Checkpoint';
const versionLabel = checkpoint.version || checkpoint.modelVersionName || '';
const baseModel = checkpoint.baseModel || checkpoint.base_model || '';
const modelTypeRaw = (checkpoint.model_type || checkpoint.type || 'checkpoint').toLowerCase();
const modelTypeLabel = modelTypeRaw === 'diffusion_model' ? 'Diffusion Model' : 'Checkpoint';
const previewMedia = isPreviewVideo ? `
<video class="thumbnail-video" autoplay loop muted playsinline>
<source src="${previewUrl}" type="video/mp4">
</video>
` : `<img src="${previewUrl}" alt="Checkpoint preview">`;
const badge = existsLocally ? `
<div class="local-badge">
<i class="fas fa-check"></i> In Library
<div class="local-path">${localPath}</div>
</div>
` : `
<div class="missing-badge">
<i class="fas fa-exclamation-triangle"></i> Not in Library
</div>
`;
let headerAction = '';
if (existsLocally && localPath) {
headerAction = `
<button class="resource-action primary compact checkpoint-send">
<i class="fas fa-paper-plane"></i>
<span>${translate('recipes.actions.sendCheckpoint', {}, 'Send to ComfyUI')}</span>
</button>
`;
} else if (this.canDownloadCheckpoint(checkpoint)) {
headerAction = `
<button class="resource-action primary compact checkpoint-download">
<i class="fas fa-download"></i>
<span>${translate('modals.model.versions.actions.download', {}, 'Download')}</span>
</button>
`;
}
return `
<div class="recipe-lora-item checkpoint-item ${existsLocally ? 'exists-locally' : 'missing-locally'}">
<div class="recipe-lora-thumbnail">
${previewMedia}
</div>
<div class="recipe-lora-content">
<div class="recipe-lora-header">
<h4>${checkpointName}</h4>
<div class="badge-container">${headerAction}</div>
</div>
<div class="recipe-lora-info recipe-checkpoint-meta">
${versionLabel ? `<div class="recipe-lora-version">${versionLabel}</div>` : ''}
${baseModel ? `<div class="base-model">${baseModel}</div>` : ''}
${modelTypeLabel ? `<div class="checkpoint-type">${modelTypeLabel}</div>` : ''}
</div>
</div>
</div>
`;
}
setupCheckpointActions(container, checkpoint) {
const sendBtn = container.querySelector('.checkpoint-send');
if (sendBtn) {
sendBtn.addEventListener('click', (e) => {
e.stopPropagation();
this.sendCheckpointToWorkflow(checkpoint);
});
}
const downloadBtn = container.querySelector('.checkpoint-download');
if (downloadBtn) {
downloadBtn.addEventListener('click', async (e) => {
e.stopPropagation();
await this.downloadCheckpoint(checkpoint, downloadBtn);
});
}
}
setupCheckpointNavigation(container, checkpoint) {
const checkpointItem = container.querySelector('.checkpoint-item');
if (!checkpointItem) return;
checkpointItem.addEventListener('click', () => {
this.navigateToCheckpointPage(checkpoint);
});
}
canDownloadCheckpoint(checkpoint) {
if (!checkpoint) return false;
const modelId = checkpoint.modelId || checkpoint.modelID || checkpoint.model_id;
const versionId = checkpoint.id || checkpoint.modelVersionId;
return !!(modelId && versionId);
}
async sendCheckpointToWorkflow(checkpoint) {
if (!checkpoint || !checkpoint.localPath) {
showToast('toast.recipes.missingCheckpointPath', {}, 'error');
return;
}
const modelType = (checkpoint.model_type || checkpoint.type || 'checkpoint').toLowerCase();
const isDiffusionModel = modelType === 'diffusion_model' || modelType === 'unet';
const widgetName = isDiffusionModel ? 'unet_name' : 'ckpt_name';
const actionTypeText = translate(
isDiffusionModel ? 'uiHelpers.nodeSelector.diffusionModel' : 'uiHelpers.nodeSelector.checkpoint',
{},
isDiffusionModel ? 'Diffusion Model' : 'Checkpoint'
);
const successMessage = translate(
isDiffusionModel ? 'uiHelpers.workflow.diffusionModelUpdated' : 'uiHelpers.workflow.checkpointUpdated',
{},
isDiffusionModel ? 'Diffusion model updated in workflow' : 'Checkpoint updated in workflow'
);
const failureMessage = translate(
isDiffusionModel ? 'uiHelpers.workflow.diffusionModelFailed' : 'uiHelpers.workflow.checkpointFailed',
{},
isDiffusionModel ? 'Failed to update diffusion model node' : 'Failed to update checkpoint node'
);
const missingNodesMessage = translate(
'uiHelpers.workflow.noMatchingNodes',
{},
'No compatible nodes available in the current workflow'
);
const missingTargetMessage = translate(
'uiHelpers.workflow.noTargetNodeSelected',
{},
'No target node selected'
);
await sendModelPathToWorkflow(checkpoint.localPath, {
widgetName,
collectionType: MODEL_TYPES.CHECKPOINT,
actionTypeText,
successMessage,
failureMessage,
missingNodesMessage,
missingTargetMessage,
});
}
async downloadCheckpoint(checkpoint, button) {
if (!this.canDownloadCheckpoint(checkpoint)) {
showToast('toast.recipes.missingCheckpointInfo', {}, 'error');
return;
}
const modelId = checkpoint.modelId || checkpoint.modelID || checkpoint.model_id;
const versionId = checkpoint.id || checkpoint.modelVersionId;
const versionName = checkpoint.version || checkpoint.modelVersionName || checkpoint.name || 'Checkpoint';
if (button) {
button.disabled = true;
}
try {
await downloadManager.downloadVersionWithDefaults(
MODEL_TYPES.CHECKPOINT,
modelId,
versionId,
{
versionName,
source: 'recipe-modal',
}
);
} catch (error) {
console.error('Error downloading checkpoint:', error);
showToast('toast.recipes.downloadCheckpointFailed', { message: error.message }, 'error');
} finally {
if (button) {
button.disabled = false;
}
}
}
navigateToCheckpointPage(checkpoint) {
const checkpointHash = this._getCheckpointHash(checkpoint);
if (!checkpointHash) {
showToast('toast.recipes.missingCheckpointInfo', {}, 'error');
return;
}
modalManager.closeModal('recipeModal');
removeSessionItem('recipe_to_checkpoint_filterHash');
removeSessionItem('recipe_to_checkpoint_filterHashes');
removeSessionItem('filterCheckpointRecipeName');
setSessionItem('recipe_to_checkpoint_filterHash', checkpointHash.toLowerCase());
if (this.currentRecipe?.title) {
setSessionItem('filterCheckpointRecipeName', this.currentRecipe.title);
}
window.location.href = '/checkpoints';
}
_getCheckpointHash(checkpoint) {
if (!checkpoint) return '';
const hash =
checkpoint.hash ||
checkpoint.sha256 ||
checkpoint.sha256_hash ||
checkpoint.sha256Hash ||
checkpoint.SHA256;
return hash ? hash.toString() : '';
}
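The fallback chain in `_getCheckpointHash` can be exercised in isolation. This standalone sketch mirrors the key precedence above (the key names are taken from the method; the sample objects are hypothetical):

```javascript
// Standalone sketch of the SHA-256 fallback chain used by _getCheckpointHash:
// checkpoint metadata from different sources stores the hash under different
// keys, so the first truthy key wins and is normalized to a string.
function getCheckpointHash(checkpoint) {
  if (!checkpoint) return '';
  const hash =
    checkpoint.hash ||
    checkpoint.sha256 ||
    checkpoint.sha256_hash ||
    checkpoint.sha256Hash ||
    checkpoint.SHA256;
  return hash ? hash.toString() : '';
}

// "hash" takes precedence over any of the sha256 variants.
console.log(getCheckpointHash({ hash: 'H', sha256: 'S' })); // → "H"
console.log(getCheckpointHash({}));                         // → ""
```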
// New method to navigate to the LoRAs page
navigateToLorasPage(specificLoraIndex = null) {
// Close the current modal
modalManager.closeModal('recipeModal');
@@ -1087,7 +1322,7 @@ class RecipeModal {
// New method to make LoRA items clickable
setupLoraItemsClickable() {
const loraItems = document.querySelectorAll('.recipe-lora-item');
const loraItems = document.querySelectorAll('.recipe-lora-item:not(.checkpoint-item)');
loraItems.forEach(item => {
// Get the lora index from the data attribute
const loraIndex = parseInt(item.dataset.loraIndex);
@@ -1107,4 +1342,4 @@ class RecipeModal {
}
}
export { RecipeModal };

View File

@@ -1,7 +1,7 @@
// CheckpointsControls.js - Specific implementation for the Checkpoints page
import { PageControls } from './PageControls.js';
import { getModelApiClient, resetAndReload } from '../../api/modelApiFactory.js';
import { showToast } from '../../utils/uiHelpers.js';
import { getSessionItem, removeSessionItem } from '../../utils/storageHelpers.js';
import { downloadManager } from '../../managers/DownloadManager.js';
/**
@@ -14,6 +14,9 @@ export class CheckpointsControls extends PageControls {
// Register API methods specific to the Checkpoints page
this.registerCheckpointsAPI();
// Check for custom filters (e.g., from recipe navigation)
this.checkCustomFilters();
}
/**
@@ -52,14 +55,65 @@ export class CheckpointsControls extends PageControls {
}
},
// No clearCustomFilter implementation is needed for checkpoints
// as custom filters are currently only used for LoRAs
clearCustomFilter: async () => {
showToast('toast.filters.noCustomFilterToClear', {}, 'info');
await this.clearCustomFilter();
}
};
// Register the API
this.registerAPI(checkpointsAPI);
}
}
/**
* Check for custom filters sent from other pages (e.g., recipe modal)
*/
checkCustomFilters() {
const filterCheckpointHash = getSessionItem('recipe_to_checkpoint_filterHash');
const filterRecipeName = getSessionItem('filterCheckpointRecipeName');
if (filterCheckpointHash && filterRecipeName) {
const indicator = document.getElementById('customFilterIndicator');
const filterText = indicator?.querySelector('.customFilterText');
if (indicator && filterText) {
indicator.classList.remove('hidden');
const displayText = `Viewing checkpoint from: ${filterRecipeName}`;
filterText.textContent = this._truncateText(displayText, 30);
filterText.setAttribute('title', displayText);
const filterElement = indicator.querySelector('.filter-active');
if (filterElement) {
filterElement.classList.add('animate');
setTimeout(() => filterElement.classList.remove('animate'), 600);
}
}
}
}
/**
* Clear checkpoint custom filter and reload
*/
async clearCustomFilter() {
removeSessionItem('recipe_to_checkpoint_filterHash');
removeSessionItem('recipe_to_checkpoint_filterHashes');
removeSessionItem('filterCheckpointRecipeName');
const indicator = document.getElementById('customFilterIndicator');
if (indicator) {
indicator.classList.add('hidden');
}
await resetAndReload();
}
/**
* Helper to truncate text with ellipsis
* @param {string} text
* @param {number} maxLength
* @returns {string}
*/
_truncateText(text, maxLength) {
return text.length > maxLength ? `${text.substring(0, maxLength - 3)}...` : text;
}
}
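`_truncateText` keeps the final string at no more than `maxLength` characters including the ellipsis, by trimming to `maxLength - 3` before appending `...`. A quick standalone check of the same logic in free-function form:

```javascript
// Same logic as _truncateText above: trim to maxLength - 3 characters and
// append "...", so the result never exceeds maxLength in total.
function truncateText(text, maxLength) {
  return text.length > maxLength ? `${text.substring(0, maxLength - 3)}...` : text;
}

console.log(truncateText('abcdefghij', 8)); // → "abcde..."
console.log(truncateText('short', 30));     // returned unchanged
```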

View File

@@ -38,52 +38,57 @@ function updateModalFilePathReferences(newFilePath) {
}
const modalElement = document.getElementById('modelModal');
if (modalElement) {
modalElement.dataset.filePath = newFilePath;
modalElement.setAttribute('data-file-path', newFilePath);
if (!modalElement) {
return;
}
const modelNameContent = document.querySelector('.model-name-content');
modalElement.dataset.filePath = newFilePath;
modalElement.setAttribute('data-file-path', newFilePath);
const scopedQuery = (selector) => modalElement.querySelector(selector);
const scopedQueryAll = (selector) => modalElement.querySelectorAll(selector);
const modelNameContent = scopedQuery('.model-name-content');
if (modelNameContent && modelNameContent.dataset) {
modelNameContent.dataset.filePath = newFilePath;
modelNameContent.setAttribute('data-file-path', newFilePath);
}
const baseModelContent = document.querySelector('.base-model-content');
const baseModelContent = scopedQuery('.base-model-content');
if (baseModelContent && baseModelContent.dataset) {
baseModelContent.dataset.filePath = newFilePath;
baseModelContent.setAttribute('data-file-path', newFilePath);
}
const fileNameContent = document.querySelector('.file-name-content');
const fileNameContent = scopedQuery('.file-name-content');
if (fileNameContent && fileNameContent.dataset) {
fileNameContent.dataset.filePath = newFilePath;
fileNameContent.setAttribute('data-file-path', newFilePath);
}
const editTagsBtn = document.querySelector('.edit-tags-btn');
const editTagsBtn = scopedQuery('.edit-tags-btn');
if (editTagsBtn) {
editTagsBtn.dataset.filePath = newFilePath;
editTagsBtn.setAttribute('data-file-path', newFilePath);
}
const editTriggerWordsBtn = document.querySelector('.edit-trigger-words-btn');
const editTriggerWordsBtn = scopedQuery('.edit-trigger-words-btn');
if (editTriggerWordsBtn) {
editTriggerWordsBtn.dataset.filePath = newFilePath;
editTriggerWordsBtn.setAttribute('data-file-path', newFilePath);
}
document.querySelectorAll('[data-action="open-file-location"]').forEach((el) => {
scopedQueryAll('[data-action="open-file-location"]').forEach((el) => {
el.dataset.filepath = newFilePath;
el.setAttribute('data-filepath', newFilePath);
});
document.querySelectorAll('[data-file-path]').forEach((el) => {
scopedQueryAll('[data-file-path]').forEach((el) => {
el.dataset.filePath = newFilePath;
el.setAttribute('data-file-path', newFilePath);
});
document.querySelectorAll('[data-filepath]').forEach((el) => {
scopedQueryAll('[data-filepath]').forEach((el) => {
el.dataset.filepath = newFilePath;
el.setAttribute('data-filepath', newFilePath);
});

View File

@@ -29,6 +29,166 @@ function getModalFilePath(fallback = '') {
return fallback;
}
const COMMERCIAL_ICON_CONFIG = [
{
key: 'image',
icon: 'photo-off.svg',
titleKey: 'modals.model.license.noImageSell',
fallback: 'No selling generated content'
},
{
key: 'rentcivit',
icon: 'brush-off.svg',
titleKey: 'modals.model.license.noRentCivit',
fallback: 'No Civitai generation'
},
{
key: 'rent',
icon: 'world-off.svg',
titleKey: 'modals.model.license.noRent',
fallback: 'No generation services'
},
{
key: 'sell',
icon: 'shopping-cart-off.svg',
titleKey: 'modals.model.license.noSell',
fallback: 'No selling models'
}
];
function hasLicenseField(license, field) {
return Object.prototype.hasOwnProperty.call(license || {}, field);
}
function escapeAttribute(value) {
return String(value ?? '')
.replace(/&/g, '&amp;')
.replace(/"/g, '&quot;');
}
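`escapeAttribute` only handles `&` and `"` because the generated markup always wraps attribute values in double quotes; the replacement order matters, since escaping ampersands first means the `&` inside a freshly produced `&quot;` is never double-escaped. A standalone check:

```javascript
// Same escaping as escapeAttribute above: ampersands first, then double
// quotes, which is sufficient for values placed in double-quoted attributes.
function escapeAttribute(value) {
  return String(value ?? '')
    .replace(/&/g, '&amp;')
    .replace(/"/g, '&quot;');
}

console.log(escapeAttribute('Tom & "Jerry"')); // → Tom &amp; &quot;Jerry&quot;
```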
function indentMarkup(markup, spaces) {
if (!markup) {
return '';
}
const padding = ' '.repeat(spaces);
return markup
.split('\n')
.map(line => (line ? padding + line : line))
.join('\n');
}
function normalizeCommercialValues(value) {
if (!value && value !== '') {
return ['Sell'];
}
if (Array.isArray(value)) {
return value.filter(item => item !== null && item !== undefined);
}
if (typeof value === 'string') {
return [value];
}
if (value && typeof value[Symbol.iterator] === 'function') {
const result = [];
for (const item of value) {
if (item === null || item === undefined) {
continue;
}
result.push(String(item));
}
if (result.length > 0) {
return result;
}
}
return ['Sell'];
}
function sanitiseCommercialValue(value) {
if (!value && value !== '') {
return '';
}
return String(value)
.trim()
.toLowerCase()
.replace(/[\s_-]+/g, '')
.replace(/[^a-z]/g, '');
}
function resolveCommercialRestrictions(value) {
const normalizedValues = normalizeCommercialValues(value);
const allowed = new Set();
normalizedValues.forEach(item => {
const cleaned = sanitiseCommercialValue(item);
if (!cleaned) {
return;
}
allowed.add(cleaned);
});
if (allowed.has('sell')) {
allowed.add('rent');
allowed.add('rentcivit');
allowed.add('image');
}
if (allowed.has('rent')) {
allowed.add('rentcivit');
}
const disallowed = [];
COMMERCIAL_ICON_CONFIG.forEach(config => {
if (!allowed.has(config.key)) {
disallowed.push(config);
}
});
return disallowed;
}
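The implication chain encoded above (allowing `Sell` implies `Rent`, `RentCivit`, and `Image`; allowing `Rent` implies `RentCivit`) determines which restriction icons are rendered. A condensed, self-contained sketch of the same resolution, returning the disallowed keys in `COMMERCIAL_ICON_CONFIG` order:

```javascript
// Condensed sketch of normalizeCommercialValues + resolveCommercialRestrictions:
// sanitize each permission, expand the implication chain, then report every
// icon key that is NOT covered. Key order mirrors COMMERCIAL_ICON_CONFIG.
const ICON_KEYS = ['image', 'rentcivit', 'rent', 'sell'];

function sanitise(value) {
  return String(value ?? '')
    .trim()
    .toLowerCase()
    .replace(/[\s_-]+/g, '')
    .replace(/[^a-z]/g, '');
}

function disallowedKeys(allowCommercialUse) {
  const values = Array.isArray(allowCommercialUse)
    ? allowCommercialUse
    : [allowCommercialUse ?? 'Sell']; // missing value defaults to full permission
  const allowed = new Set(values.map(sanitise).filter(Boolean));
  if (allowed.has('sell')) ['rent', 'rentcivit', 'image'].forEach(k => allowed.add(k));
  if (allowed.has('rent')) allowed.add('rentcivit');
  return ICON_KEYS.filter(key => !allowed.has(key));
}

console.log(disallowedKeys('Sell'));    // → [] (everything allowed)
console.log(disallowedKeys(['Image'])); // → ['rentcivit', 'rent', 'sell']
console.log(disallowedKeys('Rent'));    // → ['image', 'sell']
```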
function createLicenseIconMarkup(icon, label) {
const safeLabel = escapeAttribute(label);
const iconPath = `/loras_static/images/tabler/${icon}`;
return `<span class="license-icon" role="img" aria-label="${safeLabel}" title="${safeLabel}" style="--license-icon-image: url('${iconPath}')"></span>`;
}
function renderLicenseIcons(modelData) {
const license = modelData?.civitai?.model;
if (!license) {
return '';
}
const icons = [];
if (hasLicenseField(license, 'allowNoCredit') && license.allowNoCredit === false) {
const label = translate('modals.model.license.creditRequired', {}, 'Creator credit required');
icons.push(createLicenseIconMarkup('user-check.svg', label));
}
if (hasLicenseField(license, 'allowCommercialUse')) {
const restrictions = resolveCommercialRestrictions(license.allowCommercialUse);
restrictions.forEach(({ icon, titleKey, fallback }) => {
const label = translate(titleKey, {}, fallback);
icons.push(createLicenseIconMarkup(icon, label));
});
}
if (hasLicenseField(license, 'allowDerivatives') && license.allowDerivatives === false) {
const label = translate('modals.model.license.noDerivatives', {}, 'No sharing merges');
icons.push(createLicenseIconMarkup('exchange-off.svg', label));
}
if (hasLicenseField(license, 'allowDifferentLicense') && license.allowDifferentLicense === false) {
const label = translate('modals.model.license.noReLicense', {}, 'Same permissions required');
icons.push(createLicenseIconMarkup('rotate-2.svg', label));
}
if (!icons.length) {
return '';
}
const containerLabel = translate('modals.model.license.restrictionsLabel', {}, 'License restrictions');
const safeContainerLabel = escapeAttribute(containerLabel);
return `<div class="license-restrictions" aria-label="${safeContainerLabel}" role="group">
${icons.join('\n ')}
</div>`;
}
/**
* Display the model modal with the given model data
* @param {Object} model - Model data object
@@ -55,6 +215,51 @@ export async function showModelModal(model, modelType) {
...model,
civitai: completeCivitaiData
};
const licenseIcons = renderLicenseIcons(modelWithFullData);
const viewOnCivitaiAction = modelWithFullData.from_civitai ? `
<div class="civitai-view" title="${translate('modals.model.actions.viewOnCivitai', {}, 'View on Civitai')}" data-action="view-civitai" data-filepath="${modelWithFullData.file_path}">
<i class="fas fa-globe"></i> ${translate('modals.model.actions.viewOnCivitaiText', {}, 'View on Civitai')}
</div>`.trim() : '';
const creatorInfoAction = modelWithFullData.civitai?.creator ? `
<div class="creator-info" data-username="${modelWithFullData.civitai.creator.username}" data-action="view-creator" title="${translate('modals.model.actions.viewCreatorProfile', {}, 'View Creator Profile')}">
${modelWithFullData.civitai.creator.image ?
`<div class="creator-avatar">
<img src="${modelWithFullData.civitai.creator.image}" alt="${modelWithFullData.civitai.creator.username}" onerror="this.onerror=null; this.src='/loras_static/icons/user-placeholder.png';">
</div>` :
`<div class="creator-avatar creator-placeholder">
<i class="fas fa-user"></i>
</div>`
}
<span class="creator-username">${modelWithFullData.civitai.creator.username}</span>
</div>`.trim() : '';
const creatorActionItems = [];
if (viewOnCivitaiAction) {
creatorActionItems.push(indentMarkup(viewOnCivitaiAction, 24));
}
if (creatorInfoAction) {
creatorActionItems.push(indentMarkup(creatorInfoAction, 24));
}
const creatorActionsMarkup = creatorActionItems.length
? [
' <div class="creator-actions">',
creatorActionItems.join('\n'),
' </div>'
].join('\n')
: '';
const headerActionItems = [];
if (creatorActionsMarkup) {
headerActionItems.push(creatorActionsMarkup);
}
if (licenseIcons) {
headerActionItems.push(indentMarkup(licenseIcons.trim(), 20));
}
const headerActionsMarkup = headerActionItems.length
? [
' <div class="modal-header-actions">',
headerActionItems.join('\n'),
' </div>'
].join('\n')
: '';
const hasUpdateAvailable = Boolean(modelWithFullData.update_available);
// Prepare LoRA specific data with complete civitai data
@@ -172,25 +377,7 @@ export async function showModelModal(model, modelType) {
</button>
</div>
<div class="creator-actions">
${modelWithFullData.from_civitai ? `
<div class="civitai-view" title="${translate('modals.model.actions.viewOnCivitai', {}, 'View on Civitai')}" data-action="view-civitai" data-filepath="${modelWithFullData.file_path}">
<i class="fas fa-globe"></i> ${translate('modals.model.actions.viewOnCivitaiText', {}, 'View on Civitai')}
</div>` : ''}
${modelWithFullData.civitai?.creator ? `
<div class="creator-info" data-username="${modelWithFullData.civitai.creator.username}" data-action="view-creator" title="${translate('modals.model.actions.viewCreatorProfile', {}, 'View Creator Profile')}">
${modelWithFullData.civitai.creator.image ?
`<div class="creator-avatar">
<img src="${modelWithFullData.civitai.creator.image}" alt="${modelWithFullData.civitai.creator.username}" onerror="this.onerror=null; this.src='static/icons/user-placeholder.png';">
</div>` :
`<div class="creator-avatar creator-placeholder">
<i class="fas fa-user"></i>
</div>`
}
<span class="creator-username">${modelWithFullData.civitai.creator.username}</span>
</div>` : ''}
</div>
${headerActionsMarkup}
${renderCompactTags(modelWithFullData.tags || [], modelWithFullData.file_path)}
</header>
@@ -267,6 +454,8 @@ export async function showModelModal(model, modelType) {
</div>
`;
let showcaseCleanup;
const onCloseCallback = function() {
// Clean up all handlers when modal closes for LoRA
const modalElement = document.getElementById(modalId);
@@ -274,6 +463,10 @@ export async function showModelModal(model, modelType) {
modalElement.removeEventListener('click', modalElement._clickHandler);
delete modalElement._clickHandler;
}
if (showcaseCleanup) {
showcaseCleanup();
showcaseCleanup = null;
}
};
modalManager.showModal(modalId, content, null, onCloseCallback);
@@ -288,7 +481,7 @@ export async function showModelModal(model, modelType) {
currentVersionId: civitaiVersionId,
});
setupEditableFields(modelWithFullData.file_path, modelType);
setupShowcaseScroll(modalId);
showcaseCleanup = setupShowcaseScroll(modalId);
setupTabSwitching({
onTabChange: async (tab) => {
if (tab === 'versions') {

View File

@@ -7,6 +7,7 @@ import { state } from '../../state/index.js';
import { formatFileSize } from './utils.js';
const VIDEO_EXTENSIONS = ['.mp4', '.webm', '.mov', '.mkv'];
const PREVIEW_PLACEHOLDER_URL = '/loras_static/images/no-preview.png';
function buildCivitaiVersionUrl(modelId, versionId) {
if (modelId == null || versionId == null) {
@@ -152,6 +153,81 @@ function buildBadge(label, tone) {
return `<span class="version-badge version-badge-${tone}">${escapeHtml(label)}</span>`;
}
const DISPLAY_FILTER_MODES = Object.freeze({
SAME_BASE: 'same_base',
ANY: 'any',
});
const FILTER_LABEL_KEY = 'modals.model.versions.filters.label';
const FILTER_STATE_KEYS = {
[DISPLAY_FILTER_MODES.SAME_BASE]: 'modals.model.versions.filters.state.showSameBase',
[DISPLAY_FILTER_MODES.ANY]: 'modals.model.versions.filters.state.showAll',
};
const FILTER_TOOLTIP_KEYS = {
[DISPLAY_FILTER_MODES.SAME_BASE]: 'modals.model.versions.filters.tooltip.showAllVersions',
[DISPLAY_FILTER_MODES.ANY]: 'modals.model.versions.filters.tooltip.showSameBaseVersions',
};
function normalizeBaseModelName(value) {
if (typeof value !== 'string') {
return null;
}
const trimmed = value.trim();
if (!trimmed) {
return null;
}
return trimmed.toLowerCase();
}
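Base-model comparison throughout this tab relies on the normalization above: non-strings and blank strings map to `null`, so versions with missing metadata never match the filter. A standalone check:

```javascript
// Same normalization as normalizeBaseModelName above: trim, lowercase, and
// collapse non-strings / blanks to null so they can never compare equal.
function normalizeBaseModelName(value) {
  if (typeof value !== 'string') return null;
  const trimmed = value.trim();
  return trimmed ? trimmed.toLowerCase() : null;
}

console.log(normalizeBaseModelName('  SDXL 1.0 ')); // → "sdxl 1.0"
console.log(normalizeBaseModelName(undefined));     // → null
```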
function getToggleLabelText() {
return translate(FILTER_LABEL_KEY, {}, 'Base filter');
}
function getToggleStateText(mode) {
const key = FILTER_STATE_KEYS[mode] || FILTER_STATE_KEYS[DISPLAY_FILTER_MODES.ANY];
const fallback =
mode === DISPLAY_FILTER_MODES.SAME_BASE ? 'Same base' : 'All versions';
return translate(key, {}, fallback);
}
function getToggleTooltipText(mode) {
const key =
FILTER_TOOLTIP_KEYS[mode] || FILTER_TOOLTIP_KEYS[DISPLAY_FILTER_MODES.ANY];
const fallback =
mode === DISPLAY_FILTER_MODES.SAME_BASE
? 'Switch to showing all versions'
: 'Switch to showing only versions with the current base model';
return translate(key, {}, fallback);
}
function getDefaultDisplayMode() {
const strategy = state?.global?.settings?.update_flag_strategy;
return strategy === DISPLAY_FILTER_MODES.SAME_BASE
? DISPLAY_FILTER_MODES.SAME_BASE
: DISPLAY_FILTER_MODES.ANY;
}
function getCurrentVersionBaseModel(record, versionId) {
if (!record || typeof versionId !== 'number' || !Array.isArray(record.versions)) {
return {
normalized: null,
raw: null,
};
}
const currentVersion = record.versions.find(v => v.versionId === versionId);
if (!currentVersion) {
return {
normalized: null,
raw: null,
};
}
const baseModelRaw = currentVersion.baseModel ?? null;
return {
normalized: normalizeBaseModelName(baseModelRaw),
raw: baseModelRaw,
};
}
function getAutoplaySetting() {
try {
return Boolean(state?.global?.settings?.autoplay_on_hover);
@@ -190,6 +266,25 @@ function renderMediaMarkup(version) {
`;
}
function renderDeletePreview(version, versionName) {
const previewUrl = version?.previewUrl;
if (previewUrl && isVideoUrl(previewUrl)) {
return `
<video
src="${escapeHtml(previewUrl)}"
controls
muted
loop
playsinline
preload="metadata"
></video>
`;
}
const imageUrl = previewUrl || PREVIEW_PLACEHOLDER_URL;
return `<img src="${escapeHtml(imageUrl)}" alt="${escapeHtml(versionName)}" onerror="this.src='${PREVIEW_PLACEHOLDER_URL}'">`;
}
function renderRow(version, options) {
const { latestLibraryVersionId, currentVersionId, modelId: parentModelId } = options;
const isCurrent = currentVersionId && version.versionId === currentVersionId;
@@ -314,7 +409,7 @@ function getLatestLibraryVersionId(record) {
return Math.max(...record.inLibraryVersionIds);
}
function renderToolbar(record) {
function renderToolbar(record, toolbarState = {}) {
const ignoreText = record.shouldIgnore
? translate('modals.model.versions.actions.resumeModelUpdates', {}, 'Resume updates for this model')
: translate('modals.model.versions.actions.ignoreModelUpdates', {}, 'Ignore updates for this model');
@@ -325,10 +420,23 @@ function renderToolbar(record) {
'Track and manage every version of this model in one place.'
);
const displayMode = toolbarState.displayMode || DISPLAY_FILTER_MODES.ANY;
const toggleLabel = getToggleLabelText();
const toggleState = getToggleStateText(displayMode);
const toggleTooltip = getToggleTooltipText(displayMode);
const filterActive = toolbarState.isFilteringActive ? 'true' : 'false';
const screenReaderText = [toggleLabel, toggleState].filter(Boolean).join(': ');
return `
<header class="versions-toolbar">
<div class="versions-toolbar-info">
<h3>${translate('modals.model.versions.heading', {}, 'Model versions')}</h3>
<div class="versions-toolbar-info-heading">
<h3>${translate('modals.model.versions.heading', {}, 'Model versions')}</h3>
<button class="versions-filter-toggle" data-versions-action="toggle-version-display-mode" type="button" title="${escapeHtml(toggleTooltip)}" aria-label="${escapeHtml(toggleTooltip)}" data-filter-active="${filterActive}" aria-pressed="${filterActive}">
<i class="fas fa-th-list" aria-hidden="true"></i>
<span class="sr-only">${escapeHtml(screenReaderText)}</span>
</button>
</div>
<p>${escapeHtml(infoText)}</p>
</div>
<div class="versions-toolbar-actions">
@@ -353,6 +461,20 @@ function renderEmptyState(container) {
`;
}
function renderFilteredEmptyState(baseModelLabel) {
const message = translate(
'modals.model.versions.filters.empty',
{ baseModel: baseModelLabel },
'No versions match the current base model filter.'
);
return `
<div class="versions-empty versions-empty-filter">
<i class="fas fa-info-circle"></i>
<p>${escapeHtml(message)}</p>
</div>
`;
}
function renderErrorState(container, message) {
const fallback = translate('modals.model.versions.error', {}, 'Failed to load versions.');
container.innerHTML = `
@@ -391,6 +513,8 @@ export function initVersionsTab({
record: null,
};
let displayMode = getDefaultDisplayMode();
let apiClient;
function ensureClient() {
@@ -414,55 +538,89 @@ export function initVersionsTab({
`;
}
function render(record) {
controller.record = record;
controller.hasLoaded = true;
function render(record) {
controller.record = record;
controller.hasLoaded = true;
if (!record || !Array.isArray(record.versions) || record.versions.length === 0) {
renderEmptyState(container);
return;
}
const latestLibraryVersionId = getLatestLibraryVersionId(record);
let dividerInserted = false;
const sortedVersions = [...record.versions].sort(
(a, b) => Number(b.versionId) - Number(a.versionId)
);
const rowsMarkup = sortedVersions
.map(version => {
const isNewer =
typeof latestLibraryVersionId === 'number' &&
version.versionId > latestLibraryVersionId;
let markup = '';
if (
!dividerInserted &&
typeof latestLibraryVersionId === 'number' &&
!isNewer
) {
dividerInserted = true;
markup += '<div class="version-divider" role="presentation"></div>';
}
markup += renderRow(version, {
latestLibraryVersionId,
currentVersionId: normalizedCurrentVersionId,
modelId: record?.modelId ?? modelId,
});
return markup;
})
.join('');
container.innerHTML = `
${renderToolbar(record)}
<div class="versions-list">
${rowsMarkup}
</div>
`;
setupMediaHoverInteractions(container);
if (!record || !Array.isArray(record.versions) || record.versions.length === 0) {
renderEmptyState(container);
return;
}
const latestLibraryVersionId = getLatestLibraryVersionId(record);
const { normalized: currentBaseModelNormalized, raw: currentBaseModelLabel } =
getCurrentVersionBaseModel(record, normalizedCurrentVersionId);
const isFilteringActive =
displayMode === DISPLAY_FILTER_MODES.SAME_BASE &&
Boolean(currentBaseModelNormalized);
const sortedVersions = [...record.versions].sort(
(a, b) => Number(b.versionId) - Number(a.versionId)
);
const filteredVersions = sortedVersions.filter(version => {
if (!isFilteringActive) {
return true;
}
return normalizeBaseModelName(version.baseModel) === currentBaseModelNormalized;
});
const dividerThresholdVersionId = (() => {
if (!isFilteringActive) {
return latestLibraryVersionId;
}
const baseLocalVersionIds = record.versions
.filter(
version =>
version.isInLibrary &&
normalizeBaseModelName(version.baseModel) === currentBaseModelNormalized &&
typeof version.versionId === 'number'
)
.map(version => version.versionId);
if (!baseLocalVersionIds.length) {
return null;
}
return Math.max(...baseLocalVersionIds);
})();
let dividerInserted = false;
const rowsMarkup = filteredVersions
.map(version => {
let markup = '';
if (
!dividerInserted &&
typeof dividerThresholdVersionId === 'number' &&
!(version.versionId > dividerThresholdVersionId)
) {
dividerInserted = true;
markup += '<div class="version-divider" role="presentation"></div>';
}
markup += renderRow(version, {
latestLibraryVersionId: dividerThresholdVersionId,
currentVersionId: normalizedCurrentVersionId,
modelId: record?.modelId ?? modelId,
});
return markup;
})
.join('');
const listContent =
rowsMarkup || renderFilteredEmptyState(currentBaseModelLabel);
container.innerHTML = `
${renderToolbar(record, {
displayMode,
isFilteringActive,
})}
<div class="versions-list">
${listContent}
</div>
`;
setupMediaHoverInteractions(container);
}
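The divider logic above walks versions sorted by descending `versionId` and inserts a separator before the first entry at or below the newest in-library version. A minimal standalone sketch of that threshold scan, using a hypothetical `versions` array that mirrors only the fields the diff touches (`versionId`, `isInLibrary`):

```javascript
// Find the index where the "newer than library" divider belongs.
// `versions` is assumed sorted by versionId descending, as in render().
function findDividerIndex(versions, thresholdVersionId) {
  if (typeof thresholdVersionId !== 'number') return -1;
  // First index whose versionId is NOT above the threshold gets the divider.
  return versions.findIndex(v => !(v.versionId > thresholdVersionId));
}

const versions = [
  { versionId: 30, isInLibrary: false },
  { versionId: 20, isInLibrary: true },
  { versionId: 10, isInLibrary: true },
];
// Newest local version is 20, so the divider precedes index 1.
const threshold = Math.max(
  ...versions.filter(v => v.isInLibrary).map(v => v.versionId)
);
console.log(findDividerIndex(versions, threshold)); // → 1
```

The `!(a > b)` form, rather than `a <= b`, deliberately also matches non-numeric `versionId` values, which is why the real code uses it.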
async function loadVersions({ forceRefresh = false, eager = false } = {}) {
if (controller.isLoading) {
return;
@@ -531,6 +689,17 @@ export function initVersionsTab({
}
}
function handleToggleVersionDisplayMode() {
displayMode =
displayMode === DISPLAY_FILTER_MODES.SAME_BASE
? DISPLAY_FILTER_MODES.ANY
: DISPLAY_FILTER_MODES.SAME_BASE;
if (!controller.record) {
return;
}
render(controller.record);
}
async function handleToggleVersionIgnore(button, versionId) {
if (!controller.record) {
return;
@@ -647,9 +816,8 @@ export function initVersionsTab({
const versionName =
version.name ||
translate('modals.model.versions.labels.unnamed', {}, 'Untitled Version');
const previewUrl =
version.previewUrl || '/loras_static/images/no-preview.png';
const metaMarkup = buildMetaMarkup(version);
const previewMarkup = renderDeletePreview(version, versionName);
const modalElement = modalRecord.element;
const originalMarkup = modalElement.innerHTML;
@@ -660,7 +828,7 @@ export function initVersionsTab({
<p class="delete-message">${escapeHtml(confirmMessage)}</p>
<div class="delete-model-info">
<div class="delete-preview">
<img src="${escapeHtml(previewUrl)}" alt="${escapeHtml(versionName)}" onerror="this.src='/loras_static/images/no-preview.png'">
${previewMarkup}
</div>
<div class="delete-info">
<h3>${escapeHtml(versionName)}</h3>
@@ -799,9 +967,17 @@ export function initVersionsTab({
const toolbarAction = event.target.closest('[data-versions-action]');
if (toolbarAction) {
const action = toolbarAction.dataset.versionsAction;
if (action === 'toggle-model-ignore') {
event.preventDefault();
await handleToggleModelIgnore(toolbarAction);
switch (action) {
case 'toggle-model-ignore':
event.preventDefault();
await handleToggleModelIgnore(toolbarAction);
break;
case 'toggle-version-display-mode':
event.preventDefault();
handleToggleVersionDisplayMode();
break;
default:
break;
}
return;
}
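The toolbar handler above moves from a single `if` to a `switch` keyed on the `data-versions-action` attribute. The same delegation pattern, reduced to a sketch with a hypothetical action table standing in for the real handlers:

```javascript
// Map data-* action names to handlers; unknown actions fall through to null.
const actions = {
  'toggle-model-ignore': () => 'ignored',
  'toggle-version-display-mode': () => 'toggled',
};

function dispatchAction(name) {
  const handler = actions[name];
  return handler ? handler() : null;
}

console.log(dispatchAction('toggle-version-display-mode')); // → "toggled"
console.log(dispatchAction('unknown')); // → null
```

A lookup table (or `switch` with a `default` branch, as in the diff) keeps adding new toolbar actions a one-line change.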

View File

@@ -499,7 +499,7 @@ function addNewTriggerWord(word) {
}
// Validation: Check length
if (word.split(/\s+/).length > 30) {
if (word.split(/\s+/).length > 100) {
showToast('toast.triggerWords.tooLong', {}, 'error');
return;
}

View File

@@ -15,6 +15,18 @@ import {
import { generateMetadataPanel } from './MetadataPanel.js';
import { generateImageWrapper, generateVideoWrapper } from './MediaRenderers.js';
export const showcaseListenerMetrics = {
wheelListeners: 0,
mutationObservers: 0,
backToTopHandlers: 0,
};
export function resetShowcaseListenerMetrics() {
showcaseListenerMetrics.wheelListeners = 0;
showcaseListenerMetrics.mutationObservers = 0;
showcaseListenerMetrics.backToTopHandlers = 0;
}
/**
* Load example images asynchronously
* @param {Array} images - Array of image objects (both regular and custom)
@@ -524,8 +536,8 @@ export function scrollToTop(button) {
* @param {string} modalId - ID of the modal element
*/
export function setupShowcaseScroll(modalId) {
// Listen for wheel events
document.addEventListener('wheel', (event) => {
const wheelOptions = { passive: false };
const wheelHandler = (event) => {
const modalContent = document.querySelector(`#${modalId} .modal-content`);
if (!modalContent) return;
@@ -543,7 +555,9 @@ export function setupShowcaseScroll(modalId) {
event.preventDefault();
}
}
}, { passive: false });
};
document.addEventListener('wheel', wheelHandler, wheelOptions);
showcaseListenerMetrics.wheelListeners += 1;
// Use MutationObserver to set up back-to-top button when modal content is added
const observer = new MutationObserver((mutations) => {
@@ -558,12 +572,28 @@ export function setupShowcaseScroll(modalId) {
});
observer.observe(document.body, { childList: true, subtree: true });
showcaseListenerMetrics.mutationObservers += 1;
// Try to set up the button immediately in case the modal is already open
const modalContent = document.querySelector(`#${modalId} .modal-content`);
if (modalContent) {
setupBackToTopButton(modalContent);
}
let cleanedUp = false;
return () => {
if (cleanedUp) {
return;
}
cleanedUp = true;
document.removeEventListener('wheel', wheelHandler, wheelOptions);
showcaseListenerMetrics.wheelListeners -= 1;
observer.disconnect();
showcaseListenerMetrics.mutationObservers -= 1;
const modalContent = document.querySelector(`#${modalId} .modal-content`);
teardownBackToTopButton(modalContent);
};
}
/**
@@ -571,11 +601,9 @@ export function setupShowcaseScroll(modalId) {
* @param {HTMLElement} modalContent - Modal content element
*/
function setupBackToTopButton(modalContent) {
// Remove any existing scroll listeners to avoid duplicates
modalContent.onscroll = null;
// Add new scroll listener
modalContent.addEventListener('scroll', () => {
teardownBackToTopButton(modalContent);
const handler = () => {
const backToTopBtn = modalContent.querySelector('.back-to-top');
if (backToTopBtn) {
if (modalContent.scrollTop > 300) {
@@ -584,8 +612,23 @@ function setupBackToTopButton(modalContent) {
backToTopBtn.classList.remove('visible');
}
}
});
// Trigger a scroll event to check initial position
modalContent.dispatchEvent(new Event('scroll'));
}
};
modalContent._backToTopScrollHandler = handler;
modalContent.addEventListener('scroll', handler);
showcaseListenerMetrics.backToTopHandlers += 1;
handler();
}
function teardownBackToTopButton(modalContent) {
if (!modalContent) {
return;
}
const existingHandler = modalContent._backToTopScrollHandler;
if (existingHandler) {
modalContent.removeEventListener('scroll', existingHandler);
delete modalContent._backToTopScrollHandler;
showcaseListenerMetrics.backToTopHandlers -= 1;
}
}
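The `showcaseListenerMetrics` counters above exist so tests can assert that every `setupShowcaseScroll` call is balanced by its returned cleanup function, and that calling the cleanup twice does not double-decrement. A reduced sketch of that register/teardown symmetry (names here are illustrative, not the real API):

```javascript
const metrics = { listeners: 0 };

function register() {
  metrics.listeners += 1;
  let cleanedUp = false;
  // An idempotent disposer, mirroring setupShowcaseScroll's cleanup guard.
  return () => {
    if (cleanedUp) return;
    cleanedUp = true;
    metrics.listeners -= 1;
  };
}

const dispose = register();
dispose();
dispose(); // second call is a no-op
console.log(metrics.listeners); // → 0
```

Note that `removeEventListener` in the real code is passed the same `wheelOptions` object used at registration; a listener added with `{ passive: false }` is only removed when the capture/passive options match.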

View File

@@ -2,6 +2,7 @@ import { getCurrentPageState } from '../state/index.js';
import { showToast, updatePanelPositions } from '../utils/uiHelpers.js';
import { getModelApiClient } from '../api/modelApiFactory.js';
import { removeStorageItem, setStorageItem, getStorageItem } from '../utils/storageHelpers.js';
import { MODEL_TYPE_DISPLAY_NAMES } from '../utils/constants.js';
export class FilterManager {
constructor(options = {}) {
@@ -12,10 +13,7 @@ export class FilterManager {
this.currentPage = options.page || document.body.dataset.page || 'loras';
const pageState = getCurrentPageState();
this.filters = pageState.filters || {
baseModel: [],
tags: []
};
this.filters = this.initializeFilters(pageState ? pageState.filters : undefined);
this.filterPanel = document.getElementById('filterPanel');
this.filterButton = document.getElementById('filterButton');
@@ -27,6 +25,7 @@ export class FilterManager {
// Store this instance in the state
if (pageState) {
pageState.filterManager = this;
pageState.filters = this.cloneFilters();
}
}
@@ -36,6 +35,15 @@ export class FilterManager {
this.createBaseModelTags();
}
if (document.getElementById('modelTypeTags')) {
this.createModelTypeTags();
}
// Add click handlers for license filter tags if supported on this page
if (this.shouldShowLicenseFilters()) {
this.initializeLicenseFilters();
}
// Add click handler for filter button
if (this.filterButton) {
this.filterButton.addEventListener('click', () => {
@@ -107,17 +115,12 @@ export class FilterManager {
tagEl.dataset.tag = tagName;
tagEl.innerHTML = `${tagName} <span class="tag-count">${tag.count}</span>`;
// Add click handler to toggle selection and automatically apply
// Add click handler to cycle through tri-state filter and automatically apply
tagEl.addEventListener('click', async () => {
tagEl.classList.toggle('active');
if (tagEl.classList.contains('active')) {
if (!this.filters.tags.includes(tagName)) {
this.filters.tags.push(tagName);
}
} else {
this.filters.tags = this.filters.tags.filter(t => t !== tagName);
}
const currentState = (this.filters.tags && this.filters.tags[tagName]) || 'none';
const newState = this.getNextTriStateState(currentState);
this.setTagFilterState(tagName, newState);
this.applyTagElementState(tagEl, newState);
this.updateActiveFiltersCount();
@@ -125,10 +128,90 @@ export class FilterManager {
await this.applyFilters(false);
});
this.applyTagElementState(tagEl, (this.filters.tags && this.filters.tags[tagName]) || 'none');
tagsContainer.appendChild(tagEl);
});
}
initializeLicenseFilters() {
const licenseTags = document.querySelectorAll('.license-tag');
licenseTags.forEach(tag => {
tag.addEventListener('click', async () => {
const licenseType = tag.dataset.license;
// Ensure license object exists
if (!this.filters.license) {
this.filters.license = {};
}
// Get current state
let currentState = this.filters.license[licenseType] || 'none'; // none, include, exclude
// Cycle through states: none -> include -> exclude -> none
let newState;
switch (currentState) {
case 'none':
newState = 'include';
tag.classList.remove('exclude');
tag.classList.add('active');
break;
case 'include':
newState = 'exclude';
tag.classList.remove('active');
tag.classList.add('exclude');
break;
case 'exclude':
newState = 'none';
tag.classList.remove('active', 'exclude');
break;
}
// Update filter state
if (newState === 'none') {
delete this.filters.license[licenseType];
// Clean up empty license object
if (Object.keys(this.filters.license).length === 0) {
delete this.filters.license;
}
} else {
this.filters.license[licenseType] = newState;
}
this.updateActiveFiltersCount();
// Auto-apply filter when tag is clicked
await this.applyFilters(false);
});
});
// Update selections based on stored filters
this.updateLicenseSelections();
}
updateLicenseSelections() {
const licenseTags = document.querySelectorAll('.license-tag');
licenseTags.forEach(tag => {
const licenseType = tag.dataset.license;
const state = (this.filters.license && this.filters.license[licenseType]) || 'none';
// Reset classes
tag.classList.remove('active', 'exclude');
// Apply appropriate class based on state
switch (state) {
case 'include':
tag.classList.add('active');
break;
case 'exclude':
tag.classList.add('exclude');
break;
default:
// none state - no classes needed
break;
}
});
}
createBaseModelTags() {
const baseModelTagsContainer = document.getElementById('baseModelTags');
if (!baseModelTagsContainer) return;
@@ -172,12 +255,86 @@ export class FilterManager {
// Update selections based on stored filters
this.updateTagSelections();
}
})
.catch(error => {
console.error(`Error fetching base models for ${this.currentPage}:`, error);
baseModelTagsContainer.innerHTML = '<div class="tags-error">Failed to load base models</div>';
});
}
async createModelTypeTags() {
const modelTypeContainer = document.getElementById('modelTypeTags');
if (!modelTypeContainer) return;
modelTypeContainer.innerHTML = '<div class="tags-loading">Loading model types...</div>';
try {
const response = await fetch(`/api/lm/${this.currentPage}/model-types?limit=20`);
if (!response.ok) {
throw new Error('Failed to fetch model types');
}
const data = await response.json();
if (!data.success || !Array.isArray(data.model_types)) {
throw new Error('Invalid response format');
}
const normalizedTypes = data.model_types
.map(entry => {
if (!entry || !entry.type) {
return null;
}
const typeKey = entry.type.toString().trim().toLowerCase();
if (!typeKey || !MODEL_TYPE_DISPLAY_NAMES[typeKey]) {
return null;
}
return {
type: typeKey,
count: Number(entry.count) || 0,
};
})
.filter(Boolean);
if (!normalizedTypes.length) {
modelTypeContainer.innerHTML = '<div class="no-tags">No model types available</div>';
return;
}
modelTypeContainer.innerHTML = '';
normalizedTypes.forEach(({ type, count }) => {
const tag = document.createElement('div');
tag.className = 'filter-tag model-type-tag';
tag.dataset.modelType = type;
tag.innerHTML = `${MODEL_TYPE_DISPLAY_NAMES[type]} <span class="tag-count">${count}</span>`;
if (this.filters.modelTypes.includes(type)) {
tag.classList.add('active');
}
tag.addEventListener('click', async () => {
const isSelected = this.filters.modelTypes.includes(type);
if (isSelected) {
this.filters.modelTypes = this.filters.modelTypes.filter(value => value !== type);
tag.classList.remove('active');
} else {
this.filters.modelTypes.push(type);
tag.classList.add('active');
}
this.updateActiveFiltersCount();
await this.applyFilters(false);
});
modelTypeContainer.appendChild(tag);
});
this.updateModelTypeSelections();
} catch (error) {
console.error('Error loading model types:', error);
modelTypeContainer.innerHTML = '<div class="tags-error">Failed to load model types</div>';
}
}
toggleFilterPanel() {
@@ -227,7 +384,22 @@ export class FilterManager {
const modelTags = document.querySelectorAll('.tag-filter');
modelTags.forEach(tag => {
const tagName = tag.dataset.tag;
if (this.filters.tags.includes(tagName)) {
const state = (this.filters.tags && this.filters.tags[tagName]) || 'none';
this.applyTagElementState(tag, state);
});
// Update license tags if visible on this page
if (this.shouldShowLicenseFilters()) {
this.updateLicenseSelections();
}
this.updateModelTypeSelections();
}
updateModelTypeSelections() {
const typeTags = document.querySelectorAll('.model-type-tag');
typeTags.forEach(tag => {
const modelType = tag.dataset.modelType;
if (this.filters.modelTypes.includes(modelType)) {
tag.classList.add('active');
} else {
tag.classList.remove('active');
@@ -236,7 +408,10 @@ export class FilterManager {
}
updateActiveFiltersCount() {
const totalActiveFilters = this.filters.baseModel.length + this.filters.tags.length;
const tagFilterCount = this.filters.tags ? Object.keys(this.filters.tags).length : 0;
const licenseFilterCount = this.filters.license ? Object.keys(this.filters.license).length : 0;
const modelTypeFilterCount = this.filters.modelTypes.length;
const totalActiveFilters = this.filters.baseModel.length + tagFilterCount + licenseFilterCount + modelTypeFilterCount;
if (this.activeFiltersCount) {
if (totalActiveFilters > 0) {
@@ -253,10 +428,11 @@ export class FilterManager {
const storageKey = `${this.currentPage}_filters`;
// Save filters to localStorage
setStorageItem(storageKey, this.filters);
const filtersSnapshot = this.cloneFilters();
setStorageItem(storageKey, filtersSnapshot);
// Update state with current filters
pageState.filters = { ...this.filters };
pageState.filters = filtersSnapshot;
// Call the appropriate manager's load method based on page type
if (this.currentPage === 'recipes' && window.recipeManager) {
@@ -271,7 +447,7 @@ export class FilterManager {
this.filterButton.classList.add('active');
if (showToastNotification) {
const baseModelCount = this.filters.baseModel.length;
const tagsCount = this.filters.tags.length;
const tagsCount = this.filters.tags ? Object.keys(this.filters.tags).length : 0;
let message = '';
if (baseModelCount > 0 && tagsCount > 0) {
@@ -294,14 +470,17 @@ export class FilterManager {
async clearFilters() {
// Clear all filters
this.filters = {
this.filters = this.initializeFilters({
...this.filters,
baseModel: [],
tags: []
};
tags: {},
license: {},
modelTypes: []
});
// Update state
const pageState = getCurrentPageState();
pageState.filters = { ...this.filters };
pageState.filters = this.cloneFilters();
// Update UI
this.updateTagSelections();
@@ -335,14 +514,11 @@ export class FilterManager {
if (savedFilters) {
try {
// Ensure backward compatibility with older filter format
this.filters = {
baseModel: savedFilters.baseModel || [],
tags: savedFilters.tags || []
};
this.filters = this.initializeFilters(savedFilters);
// Update state with loaded filters
const pageState = getCurrentPageState();
pageState.filters = { ...this.filters };
pageState.filters = this.cloneFilters();
this.updateTagSelections();
this.updateActiveFiltersCount();
@@ -357,6 +533,143 @@ export class FilterManager {
}
hasActiveFilters() {
return this.filters.baseModel.length > 0 || this.filters.tags.length > 0;
const tagCount = this.filters.tags ? Object.keys(this.filters.tags).length : 0;
const licenseCount = this.filters.license ? Object.keys(this.filters.license).length : 0;
const modelTypeCount = this.filters.modelTypes.length;
return (
this.filters.baseModel.length > 0 ||
tagCount > 0 ||
licenseCount > 0 ||
modelTypeCount > 0
);
}
initializeFilters(existingFilters = {}) {
const source = existingFilters || {};
return {
...source,
baseModel: Array.isArray(source.baseModel) ? [...source.baseModel] : [],
tags: this.normalizeTagFilters(source.tags),
license: this.shouldShowLicenseFilters() ? this.normalizeLicenseFilters(source.license) : {},
modelTypes: this.normalizeModelTypeFilters(source.modelTypes)
};
}
shouldShowLicenseFilters() {
return this.currentPage !== 'recipes';
}
normalizeTagFilters(tagFilters) {
if (!tagFilters) {
return {};
}
if (Array.isArray(tagFilters)) {
return tagFilters.reduce((acc, tag) => {
if (typeof tag === 'string' && tag.trim().length > 0) {
acc[tag] = 'include';
}
return acc;
}, {});
}
if (typeof tagFilters === 'object') {
const normalized = {};
Object.entries(tagFilters).forEach(([tag, state]) => {
if (!tag) {
return;
}
const normalizedState = typeof state === 'string' ? state.toLowerCase() : '';
if (normalizedState === 'include' || normalizedState === 'exclude') {
normalized[tag] = normalizedState;
}
});
return normalized;
}
return {};
}
normalizeLicenseFilters(licenseFilters) {
if (!licenseFilters || typeof licenseFilters !== 'object') {
return {};
}
const normalized = {};
Object.entries(licenseFilters).forEach(([key, state]) => {
const normalizedState = typeof state === 'string' ? state.toLowerCase() : '';
if (normalizedState === 'include' || normalizedState === 'exclude') {
normalized[key] = normalizedState;
}
});
return normalized;
}
normalizeModelTypeFilters(modelTypes) {
if (!Array.isArray(modelTypes)) {
return [];
}
const seen = new Set();
return modelTypes.reduce((acc, type) => {
if (typeof type !== 'string') {
return acc;
}
const normalized = type.trim().toLowerCase();
if (!normalized || seen.has(normalized)) {
return acc;
}
seen.add(normalized);
acc.push(normalized);
return acc;
}, []);
}
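`normalizeModelTypeFilters` above deduplicates while preserving first-seen order by pairing a `Set` with a `reduce` accumulator. The same trick in isolation:

```javascript
// Lowercase, trim, drop non-strings and duplicates, keep first-seen order.
function normalizeList(values) {
  if (!Array.isArray(values)) return [];
  const seen = new Set();
  return values.reduce((acc, v) => {
    if (typeof v !== 'string') return acc;
    const key = v.trim().toLowerCase();
    if (!key || seen.has(key)) return acc;
    seen.add(key);
    acc.push(key);
    return acc;
  }, []);
}

console.log(normalizeList(['LoRA', ' lora ', 42, 'DoRA'])); // → ['lora', 'dora']
```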
cloneFilters() {
return {
...this.filters,
baseModel: [...(this.filters.baseModel || [])],
tags: { ...(this.filters.tags || {}) },
license: { ...(this.filters.license || {}) },
modelTypes: [...(this.filters.modelTypes || [])]
};
}
getNextTriStateState(currentState) {
switch (currentState) {
case 'none':
return 'include';
case 'include':
return 'exclude';
default:
return 'none';
}
}
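The tri-state cycle above (none → include → exclude → none) can be verified in isolation; three clicks return any tag to its starting state:

```javascript
// Same cycle as getNextTriStateState: unknown states fall back to 'none'.
function nextTriState(state) {
  switch (state) {
    case 'none': return 'include';
    case 'include': return 'exclude';
    default: return 'none';
  }
}

let state = 'none';
for (let i = 0; i < 3; i += 1) state = nextTriState(state);
console.log(state); // → "none"
```

Because the `default` branch absorbs anything unexpected, a corrupted stored state self-heals to `'none'` on the next click.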
setTagFilterState(tagName, state) {
if (!this.filters.tags) {
this.filters.tags = {};
}
if (state === 'none') {
delete this.filters.tags[tagName];
} else {
this.filters.tags[tagName] = state;
}
}
applyTagElementState(element, state) {
if (!element) {
return;
}
element.classList.remove('active', 'exclude');
if (state === 'include') {
element.classList.add('active');
} else if (state === 'exclude') {
element.classList.add('exclude');
}
}
}

View File

@@ -131,11 +131,36 @@ export class SettingsManager {
}
merged.priority_tags = normalizedPriority;
merged.auto_organize_exclusions = this.normalizePatternList(
backendSettings?.auto_organize_exclusions ?? defaults.auto_organize_exclusions
);
Object.keys(merged).forEach(key => this.backendSettingKeys.add(key));
return merged;
}
normalizePatternList(value) {
if (Array.isArray(value)) {
const sanitized = value
.map(item => typeof item === 'string' ? item.trim() : '')
.filter(Boolean);
return [...new Set(sanitized)];
}
if (typeof value === 'string') {
const sanitized = value
.replace(/\n/g, ',')
.replace(/;/g, ',')
.split(',')
.map(part => part.trim())
.filter(Boolean);
return [...new Set(sanitized)];
}
return [];
}
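`normalizePatternList` above accepts either an array or a single string delimited by commas, semicolons, or newlines, then trims and deduplicates. A standalone equivalent with a sample input showing all three delimiters at once:

```javascript
// Accept an array or a comma/semicolon/newline-delimited string of patterns.
function normalizePatterns(value) {
  if (Array.isArray(value)) {
    const sanitized = value
      .map(item => (typeof item === 'string' ? item.trim() : ''))
      .filter(Boolean);
    return [...new Set(sanitized)];
  }
  if (typeof value === 'string') {
    const sanitized = value
      .replace(/\n/g, ',')
      .replace(/;/g, ',')
      .split(',')
      .map(part => part.trim())
      .filter(Boolean);
    return [...new Set(sanitized)];
  }
  return [];
}

console.log(normalizePatterns('foo/*; bar,foo/*\nbaz')); // → ['foo/*', 'bar', 'baz']
```

Normalizing every delimiter to a comma before splitting keeps the parser a single pipeline instead of three branches.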
registerStartupMessages(messages = []) {
if (!Array.isArray(messages) || messages.length === 0) {
return;
@@ -324,6 +349,16 @@ export class SettingsManager {
}
});
const autoOrganizeInput = document.getElementById('autoOrganizeExclusions');
if (autoOrganizeInput) {
autoOrganizeInput.addEventListener('keydown', (event) => {
if (event.key === 'Enter' && !event.shiftKey) {
event.preventDefault();
this.saveAutoOrganizeExclusions();
}
});
}
this.setupPriorityTagInputs();
this.initialized = true;
@@ -371,6 +406,21 @@ export class SettingsManager {
showOnlySFWCheckbox.checked = state.global.settings.show_only_sfw ?? false;
}
const usePortableCheckbox = document.getElementById('usePortableSettings');
if (usePortableCheckbox) {
usePortableCheckbox.checked = !!state.global.settings.use_portable_settings;
}
const autoOrganizeExclusionsInput = document.getElementById('autoOrganizeExclusions');
if (autoOrganizeExclusionsInput) {
const patterns = this.normalizePatternList(state.global.settings.auto_organize_exclusions);
autoOrganizeExclusionsInput.value = patterns.join(', ');
}
const autoOrganizeExclusionsError = document.getElementById('autoOrganizeExclusionsError');
if (autoOrganizeExclusionsError) {
autoOrganizeExclusionsError.textContent = '';
}
// Set video autoplay on hover setting
const autoplayOnHoverCheckbox = document.getElementById('autoplayOnHover');
if (autoplayOnHoverCheckbox) {
@@ -407,6 +457,11 @@ export class SettingsManager {
modelNameDisplaySelect.value = state.global.settings.model_name_display || 'model_name';
}
const updateFlagStrategySelect = document.getElementById('updateFlagStrategy');
if (updateFlagStrategySelect) {
updateFlagStrategySelect.value = state.global.settings.update_flag_strategy || 'same_base';
}
// Set optimize example images setting
const optimizeExampleImagesCheckbox = document.getElementById('optimizeExampleImages');
if (optimizeExampleImagesCheckbox) {
@@ -1329,11 +1384,7 @@ export class SettingsManager {
showToast('toast.settings.settingsUpdated', { setting: settingKey.replace(/_/g, ' ') }, 'success');
if (settingKey === 'model_name_display') {
this.reloadContent();
}
if (settingKey === 'model_card_footer_action') {
if (settingKey === 'model_name_display' || settingKey === 'model_card_footer_action' || settingKey === 'update_flag_strategy') {
this.reloadContent();
}
} catch (error) {
@@ -1586,11 +1637,63 @@ export class SettingsManager {
}
}
}
async saveAutoOrganizeExclusions() {
const input = document.getElementById('autoOrganizeExclusions');
const errorElement = document.getElementById('autoOrganizeExclusionsError');
if (!input) return;
const normalized = this.normalizePatternList(input.value);
if (input.value.trim() && normalized.length === 0) {
if (errorElement) {
errorElement.textContent = translate(
'settings.autoOrganizeExclusions.validation.noPatterns',
{},
'Enter at least one pattern separated by commas or semicolons.'
);
}
return;
}
const current = this.normalizePatternList(state.global.settings.auto_organize_exclusions);
if (normalized.join('|') === current.join('|')) {
if (errorElement) {
errorElement.textContent = '';
}
return;
}
try {
if (errorElement) {
errorElement.textContent = '';
}
await this.saveSetting('auto_organize_exclusions', normalized);
input.value = normalized.join(', ');
showToast(
'toast.settings.settingsUpdated',
{ setting: translate('settings.autoOrganizeExclusions.label') },
'success'
);
} catch (error) {
console.error('Failed to save auto-organize exclusions:', error);
if (errorElement) {
errorElement.textContent = translate(
'settings.autoOrganizeExclusions.validation.saveFailed',
{ message: error.message },
`Unable to save exclusions: ${error.message}`
);
}
showToast('toast.settings.settingSaveFailed', { message: error.message }, 'error');
}
}
async saveInputSetting(elementId, settingKey) {
const element = document.getElementById(elementId);
if (!element) return;
const value = element.value.trim(); // Trim whitespace
try {

View File

@@ -56,6 +56,15 @@ export class DownloadManager {
gen_params: this.importManager.recipeData.gen_params || {},
raw_metadata: this.importManager.recipeData.raw_metadata || {}
};
const checkpointMetadata =
this.importManager.recipeData.checkpoint ||
this.importManager.recipeData.model ||
(this.importManager.recipeData.gen_params || {}).checkpoint;
if (checkpointMetadata && typeof checkpointMetadata === 'object') {
completeMetadata.checkpoint = checkpointMetadata;
}
// Add source_path to metadata to track where the recipe was imported from
if (this.importManager.importMode === 'url') {

View File

@@ -76,20 +76,24 @@ export class ImageProcessor {
}
// Get recipe data from response
this.importManager.recipeData = await response.json();
const recipeData = await response.json();
if (!recipeData) {
throw new Error('No recipe data returned from image analysis');
}
this.importManager.recipeData = recipeData;
this._ensureCheckpointMetadata();
// Check if we have an error message
if (this.importManager.recipeData.error) {
throw new Error(this.importManager.recipeData.error);
}
// Check if we have valid recipe data
if (!this.importManager.recipeData ||
!this.importManager.recipeData.loras ||
this.importManager.recipeData.loras.length === 0) {
throw new Error('No LoRA information found in this image');
}
this.importManager.recipeData.loras = Array.isArray(this.importManager.recipeData.loras)
? this.importManager.recipeData.loras
: [];
// Find missing LoRAs
this.importManager.missingLoras = this.importManager.recipeData.loras.filter(
lora => !lora.existsLocally
@@ -124,20 +128,24 @@ export class ImageProcessor {
}
// Get recipe data from response
this.importManager.recipeData = await response.json();
const recipeData = await response.json();
if (!recipeData) {
throw new Error('No recipe data returned from image analysis');
}
this.importManager.recipeData = recipeData;
this._ensureCheckpointMetadata();
// Check if we have an error message
if (this.importManager.recipeData.error) {
throw new Error(this.importManager.recipeData.error);
}
// Check if we have valid recipe data
if (!this.importManager.recipeData ||
!this.importManager.recipeData.loras ||
this.importManager.recipeData.loras.length === 0) {
throw new Error('No LoRA information found in this image');
}
this.importManager.recipeData.loras = Array.isArray(this.importManager.recipeData.loras)
? this.importManager.recipeData.loras
: [];
// Find missing LoRAs
this.importManager.missingLoras = this.importManager.recipeData.loras.filter(
lora => !lora.existsLocally
@@ -175,19 +183,23 @@ export class ImageProcessor {
});
// Get recipe data from response
this.importManager.recipeData = await response.json();
const recipeData = await response.json();
if (!recipeData) {
throw new Error('No recipe data returned from image analysis');
}
this.importManager.recipeData = recipeData;
this._ensureCheckpointMetadata();
// Check if we have an error message
if (this.importManager.recipeData.error) {
throw new Error(this.importManager.recipeData.error);
}
// Check if we have valid recipe data
if (!this.importManager.recipeData ||
!this.importManager.recipeData.loras ||
this.importManager.recipeData.loras.length === 0) {
throw new Error('No LoRA information found in this image');
}
this.importManager.recipeData.loras = Array.isArray(this.importManager.recipeData.loras)
? this.importManager.recipeData.loras
: [];
// Find missing LoRAs
this.importManager.missingLoras = this.importManager.recipeData.loras.filter(
@@ -206,4 +218,12 @@ export class ImageProcessor {
this.importManager.loadingManager.hide();
}
}
_ensureCheckpointMetadata() {
if (!this.importManager.recipeData) return;
if (this.importManager.recipeData.model && !this.importManager.recipeData.checkpoint) {
this.importManager.recipeData.checkpoint = this.importManager.recipeData.model;
}
}
}
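`_ensureCheckpointMetadata` above aliases a legacy `model` field onto `checkpoint` without clobbering an explicit value, so recipes produced before the checkpoint-in-EXIF change still resolve. A sketch of that aliasing, assuming only the two field names used in the diff:

```javascript
// Copy a legacy `model` field onto `checkpoint` only when absent.
function ensureCheckpoint(recipe) {
  if (!recipe) return recipe;
  if (recipe.model && !recipe.checkpoint) {
    recipe.checkpoint = recipe.model;
  }
  return recipe;
}

const legacy = ensureCheckpoint({ model: { name: 'sdxl-base' } });
console.log(legacy.checkpoint.name); // → "sdxl-base"
```

An existing `checkpoint` always wins, so newer recipes that carry both fields are left untouched.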

View File

@@ -5,6 +5,7 @@ import { DEFAULT_PATH_TEMPLATES, DEFAULT_PRIORITY_TAG_CONFIG } from '../utils/co
const DEFAULT_SETTINGS_BASE = Object.freeze({
civitai_api_key: '',
use_portable_settings: false,
language: 'en',
show_only_sfw: false,
enable_metadata_archive_db: false,
@@ -32,6 +33,8 @@ const DEFAULT_SETTINGS_BASE = Object.freeze({
include_trigger_words: false,
compact_mode: false,
priority_tags: { ...DEFAULT_PRIORITY_TAG_CONFIG },
update_flag_strategy: 'same_base',
auto_organize_exclusions: [],
});
export function createDefaultSettings() {
@@ -66,18 +69,20 @@ export const state = {
activeFolder: getStorageItem(`${MODEL_TYPES.LORA}_activeFolder`),
activeLetterFilter: null,
previewVersions: loraPreviewVersions,
searchManager: null,
searchOptions: {
filename: true,
modelname: true,
tags: false,
creator: false,
recursive: getStorageItem(`${MODEL_TYPES.LORA}_recursiveSearch`, true),
},
filters: {
baseModel: [],
tags: []
},
searchManager: null,
searchOptions: {
filename: true,
modelname: true,
tags: false,
creator: false,
recursive: getStorageItem(`${MODEL_TYPES.LORA}_recursiveSearch`, true),
},
filters: {
baseModel: [],
tags: {},
license: {},
modelTypes: []
},
bulkMode: false,
selectedLoras: new Set(),
loraMetadataCache: new Map(),
@@ -91,18 +96,20 @@ export const state = {
isLoading: false,
hasMore: true,
sortBy: 'date',
searchManager: null,
searchOptions: {
title: true,
tags: true,
loraName: true,
loraModel: true
},
filters: {
baseModel: [],
tags: [],
search: ''
},
searchManager: null,
searchOptions: {
title: true,
tags: true,
loraName: true,
loraModel: true
},
filters: {
baseModel: [],
tags: {},
license: {},
modelTypes: [],
search: ''
},
pageSize: 20,
showFavoritesOnly: false,
duplicatesMode: false,
@@ -117,17 +124,19 @@ export const state = {
sortBy: 'name',
activeFolder: getStorageItem(`${MODEL_TYPES.CHECKPOINT}_activeFolder`),
previewVersions: checkpointPreviewVersions,
searchManager: null,
searchOptions: {
filename: true,
modelname: true,
creator: false,
recursive: getStorageItem(`${MODEL_TYPES.CHECKPOINT}_recursiveSearch`, true),
},
filters: {
baseModel: [],
tags: []
},
searchManager: null,
searchOptions: {
filename: true,
modelname: true,
creator: false,
recursive: getStorageItem(`${MODEL_TYPES.CHECKPOINT}_recursiveSearch`, true),
},
filters: {
baseModel: [],
tags: {},
license: {},
modelTypes: []
},
modelType: 'checkpoint', // 'checkpoint' or 'diffusion_model'
bulkMode: false,
selectedModels: new Set(),
@@ -145,18 +154,20 @@ export const state = {
activeFolder: getStorageItem(`${MODEL_TYPES.EMBEDDING}_activeFolder`),
activeLetterFilter: null,
previewVersions: embeddingPreviewVersions,
searchManager: null,
searchOptions: {
filename: true,
modelname: true,
tags: false,
creator: false,
recursive: getStorageItem(`${MODEL_TYPES.EMBEDDING}_recursiveSearch`, true),
},
filters: {
baseModel: [],
tags: []
},
searchManager: null,
searchOptions: {
filename: true,
modelname: true,
tags: false,
creator: false,
recursive: getStorageItem(`${MODEL_TYPES.EMBEDDING}_recursiveSearch`, true),
},
filters: {
baseModel: [],
tags: {},
license: {},
modelTypes: []
},
bulkMode: false,
selectedModels: new Set(),
metadataCache: new Map(),

View File

@@ -26,6 +26,7 @@ export const BASE_MODELS = {
FLUX_1_S: "Flux.1 S",
FLUX_1_KREA: "Flux.1 Krea",
FLUX_1_KONTEXT: "Flux.1 Kontext",
FLUX_2_D: "Flux.2 D",
AURAFLOW: "AuraFlow",
CHROMA: "Chroma",
PIXART_A: "PixArt a",
@@ -38,6 +39,7 @@ export const BASE_MODELS = {
PONY: "Pony",
HIDREAM: "HiDream",
QWEN: "Qwen",
ZIMAGE_TURBO: "ZImageTurbo",
// Video models
SVD: "SVD",
@@ -55,6 +57,12 @@ export const BASE_MODELS = {
UNKNOWN: "Other"
};
export const MODEL_TYPE_DISPLAY_NAMES = {
lora: "LoRA",
locon: "LyCORIS",
dora: "DoRA",
};
export const BASE_MODEL_ABBREVIATIONS = {
// Stable Diffusion 1.x models
[BASE_MODELS.SD_1_4]: 'SD1',
@@ -83,6 +91,7 @@ export const BASE_MODEL_ABBREVIATIONS = {
[BASE_MODELS.FLUX_1_S]: 'F1S',
[BASE_MODELS.FLUX_1_KREA]: 'F1KR',
[BASE_MODELS.FLUX_1_KONTEXT]: 'F1KX',
[BASE_MODELS.FLUX_2_D]: 'F2D',
// Other diffusion models
[BASE_MODELS.AURAFLOW]: 'AF',
@@ -97,6 +106,7 @@ export const BASE_MODEL_ABBREVIATIONS = {
[BASE_MODELS.PONY]: 'PONY',
[BASE_MODELS.HIDREAM]: 'HID',
[BASE_MODELS.QWEN]: 'QWEN',
[BASE_MODELS.ZIMAGE_TURBO]: 'ZIT',
// Video models
[BASE_MODELS.SVD]: 'SVD',
@@ -296,10 +306,10 @@ export const BASE_MODEL_CATEGORIES = {
         BASE_MODELS.WAN_VIDEO_2_2_TI2V_5B, BASE_MODELS.WAN_VIDEO_2_2_T2V_A14B,
         BASE_MODELS.WAN_VIDEO_2_2_I2V_A14B
     ],
-    'Flux Models': [BASE_MODELS.FLUX_1_D, BASE_MODELS.FLUX_1_S, BASE_MODELS.FLUX_1_KONTEXT, BASE_MODELS.FLUX_1_KREA],
+    'Flux Models': [BASE_MODELS.FLUX_1_D, BASE_MODELS.FLUX_1_S, BASE_MODELS.FLUX_1_KONTEXT, BASE_MODELS.FLUX_1_KREA, BASE_MODELS.FLUX_2_D],
     'Other Models': [
         BASE_MODELS.ILLUSTRIOUS, BASE_MODELS.PONY, BASE_MODELS.HIDREAM,
-        BASE_MODELS.QWEN, BASE_MODELS.AURAFLOW, BASE_MODELS.CHROMA,
+        BASE_MODELS.QWEN, BASE_MODELS.AURAFLOW, BASE_MODELS.CHROMA, BASE_MODELS.ZIMAGE_TURBO,
         BASE_MODELS.PIXART_A, BASE_MODELS.PIXART_E, BASE_MODELS.HUNYUAN_1,
         BASE_MODELS.LUMINA, BASE_MODELS.KOLORS, BASE_MODELS.NOOBAI,
         BASE_MODELS.UNKNOWN
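The new constants can be exercised with a small lookup sketch. The `getBaseModelAbbreviation` helper is hypothetical (the repository may resolve abbreviations differently); only the constant names and values come from the diff above:

```javascript
// Trimmed-down copies of the constants added in the diff above.
const BASE_MODELS = {
  FLUX_2_D: "Flux.2 D",
  ZIMAGE_TURBO: "ZImageTurbo",
  UNKNOWN: "Other",
};

const BASE_MODEL_ABBREVIATIONS = {
  [BASE_MODELS.FLUX_2_D]: "F2D",
  [BASE_MODELS.ZIMAGE_TURBO]: "ZIT",
};

// Hypothetical helper: resolve a display abbreviation, falling back to
// the raw base-model name when no abbreviation is registered.
function getBaseModelAbbreviation(baseModel) {
  return BASE_MODEL_ABBREVIATIONS[baseModel] ?? baseModel;
}

console.log(getBaseModelAbbreviation(BASE_MODELS.FLUX_2_D));     // "F2D"
console.log(getBaseModelAbbreviation(BASE_MODELS.ZIMAGE_TURBO)); // "ZIT"
console.log(getBaseModelAbbreviation("SomethingElse"));          // "SomethingElse"
```

Because the categories arrays reference the same `BASE_MODELS` entries, a model missing from `BASE_MODEL_ABBREVIATIONS` still renders (as its full name) rather than breaking the UI.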


@@ -8,6 +8,9 @@
     <div class="context-menu-item" data-action="refresh-metadata">
         <i class="fas fa-sync"></i> <span>{{ t('loras.contextMenu.refreshMetadata') }}</span>
     </div>
+    <div class="context-menu-item" data-action="check-updates">
+        <i class="fas fa-bell"></i> <span>{{ t('loras.contextMenu.checkUpdates') }}</span>
+    </div>
     <div class="context-menu-item" data-action="relink-civitai">
         <i class="fas fa-link"></i> <span>{{ t('loras.contextMenu.relinkCivitai') }}</span>
     </div>
@@ -93,6 +96,9 @@
     <div class="context-menu-item" data-action="check-model-updates">
         <i class="fas fa-sync-alt"></i> <span>{{ t('globalContextMenu.checkModelUpdates.label') }}</span>
     </div>
+    <div class="context-menu-item" data-action="fetch-missing-licenses">
+        <i class="fas fa-shield-alt"></i> <span>{{ t('globalContextMenu.fetchMissingLicenses.label') }}</span>
+    </div>
     <div class="context-menu-item" data-action="cleanup-example-images-folders">
         <i class="fas fa-trash-restore"></i> <span>{{ t('globalContextMenu.cleanupExampleImages.label') }}</span>
     </div>


@@ -139,10 +139,30 @@
             <div class="tags-loading">{{ t('common.status.loading') }}</div>
         </div>
     </div>
+    {% if current_page == 'loras' %}
+    <div class="filter-section">
+        <h4>{{ t('header.filter.modelTypes') }}</h4>
+        <div class="filter-tags" id="modelTypeTags">
+            <div class="tags-loading">{{ t('common.status.loading') }}</div>
+        </div>
+    </div>
+    {% endif %}
+    {% if current_page != 'recipes' %}
+    <div class="filter-section">
+        <h4>{{ t('header.filter.license') }}</h4>
+        <div class="filter-tags">
+            <div class="filter-tag license-tag" data-license="noCredit">
+                {{ t('header.filter.noCreditRequired') }}
+            </div>
+            <div class="filter-tag license-tag" data-license="allowSelling">
+                {{ t('header.filter.allowSellingGeneratedContent') }}
+            </div>
+        </div>
+    </div>
+    {% endif %}
     <div class="filter-actions">
         <button class="clear-filters-btn" onclick="filterManager.clearFilters()">
             {{ t('header.filter.clearAll') }}
         </button>
     </div>
 </div>


@@ -38,6 +38,27 @@
         </div>
     </div>
+    <div class="settings-section">
+        <h3>{{ t('settings.sections.storageLocation') }}</h3>
+        <div class="setting-item">
+            <div class="setting-row">
+                <div class="setting-info">
+                    <label for="usePortableSettings">{{ t('settings.storage.locationLabel') }}</label>
+                </div>
+                <div class="setting-control">
+                    <label class="toggle-switch">
+                        <input type="checkbox" id="usePortableSettings" {% if settings.get('use_portable_settings', False) %}checked{% endif %}
+                            onchange="settingsManager.saveToggleSetting('usePortableSettings', 'use_portable_settings')">
+                        <span class="toggle-slider"></span>
+                    </label>
+                </div>
+            </div>
+            <div class="input-help">
+                {{ t('settings.storage.locationHelp') }}
+            </div>
+        </div>
+    </div>
     <div class="settings-section">
         <h3>{{ t('settings.sections.contentFiltering') }}</h3>
@@ -299,6 +320,27 @@
         </div>
     </div>
+    <!-- Update Flag Strategy Section -->
+    <div class="settings-section">
+        <h3>{{ t('settings.sections.updateFlags') }}</h3>
+        <div class="setting-item">
+            <div class="setting-row">
+                <div class="setting-info">
+                    <label for="updateFlagStrategy">{{ t('settings.updateFlagStrategy.label') }}</label>
+                </div>
+                <div class="setting-control select-control">
+                    <select id="updateFlagStrategy" onchange="settingsManager.saveSelectSetting('updateFlagStrategy', 'update_flag_strategy')">
+                        <option value="same_base">{{ t('settings.updateFlagStrategy.options.sameBase') }}</option>
+                        <option value="any">{{ t('settings.updateFlagStrategy.options.any') }}</option>
+                    </select>
+                </div>
+            </div>
+            <div class="input-help">
+                {{ t('settings.updateFlagStrategy.help') }}
+            </div>
+        </div>
+    </div>
     <!-- Default Path Customization Section -->
     <div class="settings-section">
         <h3>{{ t('settings.downloadPathTemplates.title') }}</h3>
@@ -459,6 +501,25 @@
         </div>
     </div>
+    <div class="setting-item priority-tags-item auto-organize-exclusions-item">
+        <div class="setting-row priority-tags-header">
+            <div class="setting-info priority-tags-info">
+                <label>{{ t('settings.autoOrganizeExclusions.label') }}</label>
+            </div>
+        </div>
+        <div class="input-help">
+            {{ t('settings.autoOrganizeExclusions.help') }}
+        </div>
+        <textarea
+            id="autoOrganizeExclusions"
+            class="priority-tags-input auto-organize-exclusions-input"
+            rows="3"
+            placeholder="{{ t('settings.autoOrganizeExclusions.placeholder') }}"
+            onblur="settingsManager.saveAutoOrganizeExclusions()"
+        ></textarea>
+        <div class="settings-input-error-message" id="autoOrganizeExclusionsError"></div>
+    </div>
     <!-- Add Example Images Settings Section -->
     <div class="settings-section">
         <h3>{{ t('settings.sections.exampleImages') }}</h3>


@@ -57,18 +57,22 @@
     <div class="info-section recipe-bottom-section">
         <div class="recipe-section-header">
             <h3>Resources</h3>
-            <div class="recipe-section-actions">
-                <span id="recipeLorasCount"><i class="fas fa-layer-group"></i> 0 LoRAs</span>
-                <button class="action-btn view-loras-btn" id="viewRecipeLorasBtn" title="View all LoRAs in this recipe">
-                    <i class="fas fa-external-link-alt"></i>
-                </button>
-                <button class="copy-btn" id="copyRecipeSyntaxBtn" title="Copy Recipe Syntax">
-                    <i class="fas fa-copy"></i>
-                </button>
-            </div>
+            <div class="recipe-section-actions">
+                <span id="recipeLorasCount"><i class="fas fa-layer-group"></i> 0 LoRAs</span>
+                <button class="action-btn view-loras-btn" id="viewRecipeLorasBtn" title="View all LoRAs in this recipe">
+                    <i class="fas fa-external-link-alt"></i>
+                </button>
+                <button class="copy-btn" id="copyRecipeSyntaxBtn" title="Copy Recipe Syntax">
+                    <i class="fas fa-copy"></i>
+                </button>
+            </div>
         </div>
         <div class="recipe-resources-list">
+            <div class="recipe-checkpoint-container" id="recipeCheckpoint"></div>
+            <div class="version-divider" id="recipeResourceDivider" style="display: none;"></div>
             <div class="recipe-loras-list" id="recipeLorasList"></div>
         </div>
     </div>
 </div>

Some files were not shown because too many files have changed in this diff.