Compare commits

..

14 Commits

| Author | SHA1 | Message | Date |
|---|---|---|---|
| Will Miao | 0817901bef | feat: update README and pyproject.toml for v0.8.10 release; add standalone mode and portable edition features | 2025-04-28 18:24:02 +08:00 |
| Will Miao | ac22172e53 | Update requirements for standalone mode | 2025-04-28 15:14:11 +08:00 |
| Will Miao | fd87fbf31e | Update workflow | 2025-04-28 07:08:35 +08:00 |
| Will Miao | 554be0908f | feat: add dynamic filename format patterns for Save Image Node in README | 2025-04-28 07:01:33 +08:00 |
| Will Miao | eaec4e5f13 | feat: update README and settings.json.example for standalone mode; enhance standalone.py to redirect status requests to loras page | 2025-04-27 09:41:33 +08:00 |
| Will Miao | 0e7ba27a7d | feat: enhance Civitai resource extraction in StandardMetadataParser for improved JSON handling. Fixes https://github.com/willmiao/ComfyUI-Lora-Manager/issues/141 | 2025-04-26 22:12:40 +08:00 |
| Will Miao | c551f5c23b | feat: update README with standalone mode instructions and add settings.json.example file | 2025-04-26 20:39:24 +08:00 |
| pixelpaws | 5159657ae5 | Merge pull request #142 from willmiao/dev (Dev) | 2025-04-26 20:25:26 +08:00 |
| Will Miao | d35db7df72 | feat: add standalone mode for LoRA Manager with setup instructions | 2025-04-26 20:23:27 +08:00 |
| Will Miao | 2b5399c559 | feat: enhance folder path retrieval for diffusion models and improve warning messages | 2025-04-26 20:08:00 +08:00 |
| Will Miao | 9e61bbbd8e | feat: improve warning management by removing existing deleted LoRAs and early access warnings | 2025-04-26 19:46:48 +08:00 |
| Will Miao | 7ce5857cd5 | feat: implement standalone mode support with mock modules and path handling | 2025-04-26 19:14:38 +08:00 |
| Will Miao | 38fbae99fd | feat: limit maximum height of loras widget to accommodate up to 5 entries. Fixes https://github.com/willmiao/ComfyUI-Lora-Manager/issues/109 | 2025-04-26 12:00:36 +08:00 |
| Will Miao | b0a9d44b0c | Add support for SamplerCustomAdvanced node in metadata extraction | 2025-04-26 09:40:44 +08:00 |
17 changed files with 806 additions and 120 deletions

View File

@@ -20,6 +20,12 @@ Watch this quick tutorial to learn how to use the new one-click LoRA integration
## Release Notes
### v0.8.10
* **Standalone Mode** - Run LoRA Manager independently from ComfyUI for a lightweight experience that works even with other stable diffusion interfaces
* **Portable Edition** - New one-click portable version for easy startup and updates in standalone mode
* **Enhanced Metadata Collection** - Added support for SamplerCustomAdvanced node in the metadata collector module
* **Improved UI Organization** - Optimized Lora Loader node height to display up to 5 LoRAs at once with scrolling capability for larger collections
### v0.8.9
* **Favorites System** - New functionality to bookmark your favorite LoRAs and checkpoints for quick access and better organization
* **Enhanced UI Controls** - Increased model card button sizes for improved usability and easier interaction
@@ -173,6 +179,68 @@ pip install -r requirements.txt
- Paste into the Lora Loader node's text input
- The node will automatically apply preset strength and trigger words
### Filename Format Patterns for Save Image Node
The Save Image Node supports dynamic filename generation using pattern codes. You can customize how your images are named using the following format patterns:
#### Available Pattern Codes
- `%seed%` - Inserts the generation seed number
- `%width%` - Inserts the image width
- `%height%` - Inserts the image height
- `%pprompt:N%` - Inserts the positive prompt (limited to N characters)
- `%nprompt:N%` - Inserts the negative prompt (limited to N characters)
- `%model:N%` - Inserts the model/checkpoint name (limited to N characters)
- `%date%` - Inserts current date/time as "yyyyMMddhhmmss"
- `%date:FORMAT%` - Inserts date using custom format with:
- `yyyy` - 4-digit year
- `yy` - 2-digit year
- `MM` - 2-digit month
- `dd` - 2-digit day
- `hh` - 2-digit hour
- `mm` - 2-digit minute
- `ss` - 2-digit second
#### Examples
- `image_%seed%` → `image_1234567890`
- `gen_%width%x%height%` → `gen_512x768`
- `%model:10%_%seed%` → `dreamshape_1234567890`
- `%date:yyyy-MM-dd%` → `2025-04-28`
- `%pprompt:20%_%seed%` → `beautiful landscape_1234567890`
- `%model%_%date:yyMMdd%_%seed%` → `dreamshaper_v8_250428_1234567890`
You can combine multiple patterns to create detailed, organized filenames for your generated images.
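As a rough illustration of how such patterns could be expanded, here is a minimal sketch; the function name `expand_filename` and the token-to-`strftime` mapping are assumptions for illustration, not the Save Image Node's actual implementation:

```python
import re
from datetime import datetime

def expand_filename(pattern, seed, width, height, pprompt="", nprompt="", model=""):
    """Hypothetical sketch of pattern expansion for illustration only."""
    # %date% / %date:FORMAT% -> map the documented tokens onto strftime codes
    def date_repl(m):
        fmt = m.group(1) or "yyyyMMddhhmmss"
        for token, code in [("yyyy", "%Y"), ("yy", "%y"), ("MM", "%m"),
                            ("dd", "%d"), ("hh", "%H"), ("mm", "%M"), ("ss", "%S")]:
            fmt = fmt.replace(token, code)
        return datetime.now().strftime(fmt)

    out = re.sub(r"%date(?::([^%]+))?%", date_repl, pattern)

    # %pprompt:N% / %nprompt:N% / %model:N% -> substitute, truncated to N characters
    def trunc_repl(m):
        value = {"pprompt": pprompt, "nprompt": nprompt, "model": model}[m.group(1)]
        return value[:int(m.group(2))] if m.group(2) else value

    out = re.sub(r"%(pprompt|nprompt|model)(?::(\d+))?%", trunc_repl, out)

    # Simple numeric substitutions
    return (out.replace("%seed%", str(seed))
               .replace("%width%", str(width))
               .replace("%height%", str(height)))

print(expand_filename("%model:10%_%seed%", 1234567890, 512, 768, model="dreamshaper_v8"))
# dreamshape_1234567890
```

Note how the replacement order matters: date and truncating tokens are handled first so that, for example, a `%` produced by a prompt substitution cannot be misread as a new pattern code for seed, width, or height.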
### Standalone Mode
You can now run LoRA Manager independently from ComfyUI:
1. **For ComfyUI users**:
- Launch ComfyUI with LoRA Manager at least once to initialize the necessary path information in the `settings.json` file.
- Make sure dependencies are installed: `pip install -r requirements.txt`
- From your ComfyUI root directory, run:
```bash
python custom_nodes\comfyui-lora-manager\standalone.py
```
- Access the interface at: `http://localhost:8188/loras`
- You can specify a different host or port with arguments:
```bash
python custom_nodes\comfyui-lora-manager\standalone.py --host 127.0.0.1 --port 9000
```
2. **For non-ComfyUI users**:
- Copy the provided `settings.json.example` file to create a new file named `settings.json`
- Edit `settings.json` to include your correct model folder paths and CivitAI API key
- Install required dependencies: `pip install -r requirements.txt`
- Run standalone mode:
```bash
python standalone.py
```
- Access the interface through your browser at: `http://localhost:8188/loras`
This standalone mode provides a lightweight option for managing your model and recipe collection without needing to run the full ComfyUI environment, making it useful even for users who primarily use other stable diffusion interfaces.
---
## Contributing
@@ -209,3 +277,4 @@ Join our Discord community for support, discussions, and updates:
[Discord Server](https://discord.gg/vcqNrWVFvM)
---

View File

@@ -3,6 +3,11 @@ import platform
import folder_paths  # type: ignore
from typing import List
import logging
import sys
import json
# Check if running in standalone mode
standalone_mode = 'nodes' not in sys.modules
logger = logging.getLogger(__name__)
@@ -18,9 +23,46 @@ class Config:
        self._route_mappings = {}
        self.loras_roots = self._init_lora_paths()
        self.checkpoints_roots = self._init_checkpoint_paths()
        self.temp_directory = folder_paths.get_temp_directory()

        # Scan symbolic links at initialization
        self._scan_symbolic_links()

        if not standalone_mode:
            # Save the paths to settings.json when running in ComfyUI mode
            self.save_folder_paths_to_settings()

    def save_folder_paths_to_settings(self):
        """Save folder paths to settings.json for standalone mode to use later"""
        try:
            # Check if we're running in ComfyUI mode (not standalone)
            if hasattr(folder_paths, "get_folder_paths") and not isinstance(folder_paths, type):
                # Get all relevant paths
                lora_paths = folder_paths.get_folder_paths("loras")
                checkpoint_paths = folder_paths.get_folder_paths("checkpoints")
                diffuser_paths = folder_paths.get_folder_paths("diffusers")
                unet_paths = folder_paths.get_folder_paths("unet")

                # Load existing settings
                settings_path = os.path.join(os.path.dirname(os.path.dirname(__file__)), 'settings.json')
                settings = {}
                if os.path.exists(settings_path):
                    with open(settings_path, 'r', encoding='utf-8') as f:
                        settings = json.load(f)

                # Update settings with paths
                settings['folder_paths'] = {
                    'loras': lora_paths,
                    'checkpoints': checkpoint_paths,
                    'diffusers': diffuser_paths,
                    'unet': unet_paths
                }

                # Save settings
                with open(settings_path, 'w', encoding='utf-8') as f:
                    json.dump(settings, f, indent=2)

                logger.info("Saved folder paths to settings.json")
        except Exception as e:
            logger.warning(f"Failed to save folder paths: {e}")
def _is_link(self, path: str) -> bool: def _is_link(self, path: str) -> bool:
try: try:
@@ -103,58 +145,66 @@ class Config:
    def _init_lora_paths(self) -> List[str]:
        """Initialize and validate LoRA paths from ComfyUI settings"""
        try:
            raw_paths = folder_paths.get_folder_paths("loras")

            # Normalize and resolve symlinks, store mapping from resolved -> original
            path_map = {}
            for path in raw_paths:
                if os.path.exists(path):
                    real_path = os.path.normpath(os.path.realpath(path)).replace(os.sep, '/')
                    path_map[real_path] = path_map.get(real_path, path.replace(os.sep, "/"))  # preserve first seen

            # Now sort and use only the deduplicated real paths
            unique_paths = sorted(path_map.values(), key=lambda p: p.lower())
            logger.info("Found LoRA roots:" + ("\n - " + "\n - ".join(unique_paths) if unique_paths else "[]"))

            if not unique_paths:
                logger.warning("No valid loras folders found in ComfyUI configuration")
                return []

            for original_path in unique_paths:
                real_path = os.path.normpath(os.path.realpath(original_path)).replace(os.sep, '/')
                if real_path != original_path:
                    self.add_path_mapping(original_path, real_path)

            return unique_paths
        except Exception as e:
            logger.warning(f"Error initializing LoRA paths: {e}")
            return []

    def _init_checkpoint_paths(self) -> List[str]:
        """Initialize and validate checkpoint paths from ComfyUI settings"""
        try:
            # Get checkpoint paths from folder_paths
            checkpoint_paths = folder_paths.get_folder_paths("checkpoints")
            diffusion_paths = folder_paths.get_folder_paths("diffusers")
            unet_paths = folder_paths.get_folder_paths("unet")

            # Combine all checkpoint-related paths
            all_paths = checkpoint_paths + diffusion_paths + unet_paths

            # Filter and normalize paths
            paths = sorted(set(path.replace(os.sep, "/")
                               for path in all_paths
                               if os.path.exists(path)), key=lambda p: p.lower())

            logger.info("Found checkpoint roots:" + ("\n - " + "\n - ".join(paths) if paths else "[]"))

            if not paths:
                logger.warning("No valid checkpoint folders found in ComfyUI configuration")
                return []

            # Initialize path mappings, same as the LoRA path handling
            for path in paths:
                real_path = os.path.normpath(os.path.realpath(path)).replace(os.sep, '/')
                if real_path != path:
                    self.add_path_mapping(path, real_path)

            return paths
        except Exception as e:
            logger.warning(f"Error initializing checkpoint paths: {e}")
            return []
    def get_preview_static_url(self, preview_path: str) -> str:
        """Convert local preview path to static URL"""

View File

@@ -9,9 +9,13 @@ from .routes.update_routes import UpdateRoutes
from .routes.usage_stats_routes import UsageStatsRoutes
from .services.service_registry import ServiceRegistry
import logging
import sys

logger = logging.getLogger(__name__)

# Check if we're in standalone mode
STANDALONE_MODE = 'nodes' not in sys.modules

class LoraManager:
    """Main entry point for LoRA Manager plugin"""
@@ -20,6 +24,9 @@ class LoraManager:
        """Initialize and register all routes"""
        app = PromptServer.instance.app

        # Configure aiohttp access logger to be less verbose
        logging.getLogger('aiohttp.access').setLevel(logging.WARNING)

        added_targets = set()  # Track already added target paths

        # Add static routes for each lora root
@@ -108,6 +115,9 @@ class LoraManager:
    async def _initialize_services(cls):
        """Initialize all services using the ServiceRegistry"""
        try:
            # Ensure aiohttp access logger is configured with reduced verbosity
            logging.getLogger('aiohttp.access').setLevel(logging.WARNING)

            # Initialize CivitaiClient first to ensure it's ready for other services
            civitai_client = await ServiceRegistry.get_civitai_client()
@@ -137,6 +147,12 @@ class LoraManager:
            # Initialize recipe scanner if needed
            recipe_scanner = await ServiceRegistry.get_recipe_scanner()

            # Initialize metadata collector if not in standalone mode
            if not STANDALONE_MODE:
                from .metadata_collector import init as init_metadata
                init_metadata()
                logger.debug("Metadata collector initialized")

            # Create low-priority initialization tasks
            asyncio.create_task(lora_scanner.initialize_in_background(), name='lora_cache_init')
            asyncio.create_task(checkpoint_scanner.initialize_in_background(), name='checkpoint_cache_init')

View File

@@ -1,18 +1,32 @@
import os
import importlib
import sys

# Check if running in standalone mode
standalone_mode = 'nodes' not in sys.modules

if not standalone_mode:
    from .metadata_hook import MetadataHook
    from .metadata_registry import MetadataRegistry

    def init():
        # Install hooks to collect metadata during execution
        MetadataHook.install()

        # Initialize registry
        registry = MetadataRegistry()
        print("ComfyUI Metadata Collector initialized")

    def get_metadata(prompt_id=None):
        """Helper function to get metadata from the registry"""
        registry = MetadataRegistry()
        return registry.get_metadata(prompt_id)
else:
    # Standalone mode - provide dummy implementations
    def init():
        print("ComfyUI Metadata Collector disabled in standalone mode")

    def get_metadata(prompt_id=None):
        """Dummy implementation for standalone mode"""
        return {}

View File

@@ -1,4 +1,8 @@
import json
import sys

# Check if running in standalone mode
standalone_mode = 'nodes' not in sys.modules

from .constants import MODELS, PROMPTS, SAMPLING, LORAS, SIZE
@@ -11,7 +15,16 @@ class MetadataProcessor:
        primary_sampler = None
        primary_sampler_id = None

        # First, check for SamplerCustomAdvanced
        prompt = metadata.get("current_prompt")
        if prompt and prompt.original_prompt:
            for node_id, node_info in prompt.original_prompt.items():
                if node_info.get("class_type") == "SamplerCustomAdvanced":
                    # Found a SamplerCustomAdvanced node
                    if node_id in metadata.get(SAMPLING, {}):
                        return node_id, metadata[SAMPLING][node_id]

        # Next, check for KSamplerAdvanced with add_noise="enable"
        for node_id, sampler_info in metadata.get(SAMPLING, {}).items():
            parameters = sampler_info.get("parameters", {})
            add_noise = parameters.get("add_noise")
@@ -22,7 +35,7 @@ class MetadataProcessor:
                primary_sampler_id = node_id
                break

        # If no specialized sampler found, fall back to traditional KSampler with denoise=1
        if primary_sampler is None:
            for node_id, sampler_info in metadata.get(SAMPLING, {}).items():
                parameters = sampler_info.get("parameters", {})
@@ -152,22 +165,60 @@ class MetadataProcessor:
        # Trace connections from the primary sampler
        if prompt and primary_sampler_id:
            # Check if this is a SamplerCustomAdvanced node
            is_custom_advanced = False
            if prompt.original_prompt and primary_sampler_id in prompt.original_prompt:
                is_custom_advanced = prompt.original_prompt[primary_sampler_id].get("class_type") == "SamplerCustomAdvanced"

            if is_custom_advanced:
                # For SamplerCustomAdvanced, trace specific inputs

                # 1. Trace sigmas input to find BasicScheduler
                scheduler_node_id = MetadataProcessor.trace_node_input(prompt, primary_sampler_id, "sigmas", "BasicScheduler", max_depth=5)
                if scheduler_node_id and scheduler_node_id in metadata.get(SAMPLING, {}):
                    scheduler_params = metadata[SAMPLING][scheduler_node_id].get("parameters", {})
                    params["steps"] = scheduler_params.get("steps")
                    params["scheduler"] = scheduler_params.get("scheduler")

                # 2. Trace sampler input to find KSamplerSelect
                sampler_node_id = MetadataProcessor.trace_node_input(prompt, primary_sampler_id, "sampler", "KSamplerSelect", max_depth=5)
                if sampler_node_id and sampler_node_id in metadata.get(SAMPLING, {}):
                    sampler_params = metadata[SAMPLING][sampler_node_id].get("parameters", {})
                    params["sampler"] = sampler_params.get("sampler_name")

                # 3. Trace guider input for FluxGuidance and CLIPTextEncode
                guider_node_id = MetadataProcessor.trace_node_input(prompt, primary_sampler_id, "guider", max_depth=5)
                if guider_node_id:
                    # Look for FluxGuidance along the guider path
                    flux_node_id = MetadataProcessor.trace_node_input(prompt, guider_node_id, "conditioning", "FluxGuidance", max_depth=5)
                    if flux_node_id and flux_node_id in metadata.get(SAMPLING, {}):
                        flux_params = metadata[SAMPLING][flux_node_id].get("parameters", {})
                        params["guidance"] = flux_params.get("guidance")

                    # Find CLIPTextEncode for positive prompt (through conditioning)
                    positive_node_id = MetadataProcessor.trace_node_input(prompt, guider_node_id, "conditioning", "CLIPTextEncode", max_depth=10)
                    if positive_node_id and positive_node_id in metadata.get(PROMPTS, {}):
                        params["prompt"] = metadata[PROMPTS][positive_node_id].get("text", "")
            else:
                # Original tracing for standard samplers
                # Trace positive prompt - look specifically for CLIPTextEncode
                positive_node_id = MetadataProcessor.trace_node_input(prompt, primary_sampler_id, "positive", "CLIPTextEncode", max_depth=10)
                if positive_node_id and positive_node_id in metadata.get(PROMPTS, {}):
                    params["prompt"] = metadata[PROMPTS][positive_node_id].get("text", "")

                # Find any FluxGuidance nodes in the positive conditioning path
                flux_node_id = MetadataProcessor.trace_node_input(prompt, primary_sampler_id, "positive", "FluxGuidance", max_depth=5)
                if flux_node_id and flux_node_id in metadata.get(SAMPLING, {}):
                    flux_params = metadata[SAMPLING][flux_node_id].get("parameters", {})
                    params["guidance"] = flux_params.get("guidance")

                # Trace negative prompt - look specifically for CLIPTextEncode
                negative_node_id = MetadataProcessor.trace_node_input(prompt, primary_sampler_id, "negative", "CLIPTextEncode", max_depth=10)
                if negative_node_id and negative_node_id in metadata.get(PROMPTS, {}):
                    params["negative_prompt"] = metadata[PROMPTS][negative_node_id].get("text", "")

            # Size extraction is same for all sampler types
            # Check if the sampler itself has size information (from latent_image)
            if primary_sampler_id in metadata.get(SIZE, {}):
                width = metadata[SIZE][primary_sampler_id].get("width")
@@ -229,6 +280,10 @@ class MetadataProcessor:
    @staticmethod
    def to_dict(metadata):
        """Convert extracted metadata to the ComfyUI output.json format"""
        if standalone_mode:
            # Return empty dictionary in standalone mode
            return {}

        params = MetadataProcessor.extract_generation_params(metadata)

        # Convert all values to strings to match output.json format

View File

@@ -257,12 +257,85 @@ class VAEDecodeExtractor(NodeMetadataExtractor):
        if "first_decode" not in metadata[IMAGES]:
            metadata[IMAGES]["first_decode"] = metadata[IMAGES][node_id]
class KSamplerSelectExtractor(NodeMetadataExtractor):
    @staticmethod
    def extract(node_id, inputs, outputs, metadata):
        if not inputs or "sampler_name" not in inputs:
            return

        sampling_params = {"sampler_name": inputs["sampler_name"]}

        metadata[SAMPLING][node_id] = {
            "parameters": sampling_params,
            "node_id": node_id
        }

class BasicSchedulerExtractor(NodeMetadataExtractor):
    @staticmethod
    def extract(node_id, inputs, outputs, metadata):
        if not inputs:
            return

        sampling_params = {}
        for key in ["scheduler", "steps", "denoise"]:
            if key in inputs:
                sampling_params[key] = inputs[key]

        metadata[SAMPLING][node_id] = {
            "parameters": sampling_params,
            "node_id": node_id
        }

class SamplerCustomAdvancedExtractor(NodeMetadataExtractor):
    @staticmethod
    def extract(node_id, inputs, outputs, metadata):
        if not inputs:
            return

        sampling_params = {}

        # Handle noise.seed as seed
        if "noise" in inputs and inputs["noise"] is not None and hasattr(inputs["noise"], "seed"):
            sampling_params["seed"] = inputs["noise"].seed

        metadata[SAMPLING][node_id] = {
            "parameters": sampling_params,
            "node_id": node_id
        }

        # Extract latent image dimensions if available
        if "latent_image" in inputs and inputs["latent_image"] is not None:
            latent = inputs["latent_image"]
            if isinstance(latent, dict) and "samples" in latent:
                # Extract dimensions from latent tensor
                samples = latent["samples"]
                # Shape is [batch_size, channels, height/8, width/8]; indices 2 and 3
                # are read below, so require at least 4 dimensions (not 3)
                if hasattr(samples, "shape") and len(samples.shape) >= 4:
                    # Multiply by 8 to get actual pixel dimensions
                    height = int(samples.shape[2] * 8)
                    width = int(samples.shape[3] * 8)

                    if SIZE not in metadata:
                        metadata[SIZE] = {}

                    metadata[SIZE][node_id] = {
                        "width": width,
                        "height": height,
                        "node_id": node_id
                    }
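The latent-to-pixel arithmetic in the extractor above can be sanity-checked with a plain stand-in object (no torch dependency; `FakeLatent` and `latent_to_pixels` are illustrative names, not part of the codebase):

```python
class FakeLatent:
    """Stand-in for a latent tensor; only the .shape attribute matters here."""
    def __init__(self, shape):
        self.shape = shape

def latent_to_pixels(samples):
    # SD latents are 1/8 scale: shape is [batch, channels, height/8, width/8]
    if hasattr(samples, "shape") and len(samples.shape) >= 4:
        return int(samples.shape[3] * 8), int(samples.shape[2] * 8)  # (width, height)
    return None

# A 64x96 latent corresponds to a 512x768 (width x height) image
print(latent_to_pixels(FakeLatent((1, 4, 96, 64))))  # (512, 768)
```

This also shows why the dimension guard matters: a 3-dimensional shape would make `shape[3]` raise `IndexError`, so the helper (like the fixed extractor) requires at least 4 dimensions.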
# Registry of node-specific extractors
NODE_EXTRACTORS = {
    # Sampling
    "KSampler": SamplerExtractor,
    "KSamplerAdvanced": KSamplerAdvancedExtractor,
    "SamplerCustomAdvanced": SamplerCustomAdvancedExtractor,  # Updated to use dedicated extractor

    # Sampling Selectors
    "KSamplerSelect": KSamplerSelectExtractor,  # Add KSamplerSelect
    "BasicScheduler": BasicSchedulerExtractor,  # Add BasicScheduler

    # Loaders
    "CheckpointLoaderSimple": CheckpointLoaderExtractor,
    "UNETLoader": UNETLoaderExtractor,  # Updated to use dedicated extractor

View File

@@ -10,16 +10,25 @@ from typing import Dict
import tempfile
import json
import asyncio
import sys
from ..utils.exif_utils import ExifUtils
from ..utils.recipe_parsers import RecipeParserFactory
from ..utils.constants import CARD_PREVIEW_WIDTH
from ..config import config

# Check if running in standalone mode
standalone_mode = 'nodes' not in sys.modules

from ..utils.utils import download_civitai_image
from ..services.service_registry import ServiceRegistry  # Add ServiceRegistry import

# Only import MetadataRegistry in non-standalone mode
if not standalone_mode:
    # Import metadata_collector functions and classes conditionally
    from ..metadata_collector import get_metadata
    from ..metadata_collector.metadata_processor import MetadataProcessor
    from ..metadata_collector.metadata_registry import MetadataRegistry
logger = logging.getLogger(__name__)
@@ -804,8 +813,11 @@ class RecipeRoutes:
            return web.json_response({"error": "No generation metadata found"}, status=400)
        # Get the most recent image from metadata registry instead of temp directory
        if not standalone_mode:
            metadata_registry = MetadataRegistry()
            latest_image = metadata_registry.get_first_decoded_image()
        else:
            latest_image = None

        if not latest_image:
            return web.json_response({"error": "No recent images found to use for recipe. Try generating an image first."}, status=400)

View File

@@ -403,27 +403,43 @@ class StandardMetadataParser(RecipeMetadataParser):
            # Extract Civitai resources
            if 'Civitai resources:' in user_comment:
                resources_part = user_comment.split('Civitai resources:', 1)[1].strip()

                # Look for the opening and closing brackets to extract the JSON array
                if resources_part.startswith('['):
                    # Find the position of the closing bracket
                    bracket_count = 0
                    end_pos = -1

                    for i, char in enumerate(resources_part):
                        if char == '[':
                            bracket_count += 1
                        elif char == ']':
                            bracket_count -= 1
                            if bracket_count == 0:
                                end_pos = i
                                break

                    if end_pos != -1:
                        resources_json = resources_part[:end_pos+1]
                        try:
                            resources = json.loads(resources_json)
                            # Filter loras and checkpoints
                            for resource in resources:
                                if resource.get('type') == 'lora':
                                    # Make sure the weight field is preserved correctly
                                    lora_entry = resource.copy()
                                    # Default weight to 1.0 when missing
                                    if 'weight' not in lora_entry:
                                        lora_entry['weight'] = 1.0
                                    # Ensure modelVersionName is included
                                    if 'modelVersionName' not in lora_entry:
                                        lora_entry['modelVersionName'] = ''
                                    metadata['loras'].append(lora_entry)
                                elif resource.get('type') == 'checkpoint':
                                    metadata['checkpoint'] = resource
                        except json.JSONDecodeError:
                            pass
            return metadata

        except Exception as e:

View File

@@ -1,5 +1,6 @@
import os
import json
import sys
import time
import asyncio
import logging
@@ -7,8 +8,13 @@ from typing import Dict, Set
from ..config import config
from ..services.service_registry import ServiceRegistry

# Check if running in standalone mode
standalone_mode = 'nodes' not in sys.modules

if not standalone_mode:
    from ..metadata_collector.metadata_registry import MetadataRegistry
    from ..metadata_collector.constants import MODELS, LORAS

logger = logging.getLogger(__name__)

View File

@@ -1,7 +1,7 @@
[project]
name = "comfyui-lora-manager"
description = "LoRA Manager for ComfyUI - Access it at http://localhost:8188/loras for managing LoRA models with previews and metadata integration."
version = "0.8.10"
license = {file = "LICENSE"}
dependencies = [
    "aiohttp",
@@ -7,4 +7,6 @@ piexif
 Pillow
 olefile
 requests
 toml
+numpy
+torch

settings.json.example (new file, 14 lines)

@@ -0,0 +1,14 @@
{
    "civitai_api_key": "your_civitai_api_key_here",
    "show_only_sfw": false,
    "folder_paths": {
        "loras": [
            "C:/path/to/your/loras_folder",
            "C:/path/to/another/loras_folder"
        ],
        "checkpoints": [
            "C:/path/to/your/checkpoints_folder",
            "C:/path/to/another/checkpoints_folder"
        ]
    }
}
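A standalone launcher resolving model folders from this file might look like the sketch below; `resolve_folder_paths` is an illustrative name, and the behavior (drop directories that don't exist, fall back to an empty list) mirrors the mock in `standalone.py`:

```python
import json
import os

def resolve_folder_paths(settings_path: str, folder_name: str) -> list:
    """Return the existing directories configured for folder_name (sketch)."""
    if not os.path.exists(settings_path):
        return []
    with open(settings_path, 'r', encoding='utf-8') as f:
        settings = json.load(f)
    paths = settings.get('folder_paths', {}).get(folder_name, [])
    # Keep only paths that actually exist on disk
    return [p for p in paths if os.path.exists(p)]
```

Silently filtering missing directories lets one settings file be shared across machines with different drive layouts.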

standalone.py (new file, 347 lines)

@@ -0,0 +1,347 @@
import os
import sys
import json

# Create mock folder_paths module BEFORE any other imports
class MockFolderPaths:
    @staticmethod
    def get_folder_paths(folder_name):
        # Load paths from settings.json
        settings_path = os.path.join(os.path.dirname(__file__), 'settings.json')
        try:
            if os.path.exists(settings_path):
                with open(settings_path, 'r', encoding='utf-8') as f:
                    settings = json.load(f)

                # For diffusion_models, combine unet and diffusers paths
                if folder_name == "diffusion_models":
                    paths = []
                    if 'folder_paths' in settings:
                        if 'unet' in settings['folder_paths']:
                            paths.extend(settings['folder_paths']['unet'])
                        if 'diffusers' in settings['folder_paths']:
                            paths.extend(settings['folder_paths']['diffusers'])

                    # Filter out paths that don't exist
                    valid_paths = [p for p in paths if os.path.exists(p)]
                    if valid_paths:
                        return valid_paths
                    else:
                        print(f"Warning: No valid paths found for {folder_name}")
                # For other folder names, return their paths directly
                elif 'folder_paths' in settings and folder_name in settings['folder_paths']:
                    paths = settings['folder_paths'][folder_name]
                    valid_paths = [p for p in paths if os.path.exists(p)]
                    if valid_paths:
                        return valid_paths
                    else:
                        print(f"Warning: No valid paths found for {folder_name}")
        except Exception as e:
            print(f"Error loading folder paths from settings: {e}")

        # Fallback to empty list if no paths found
        return []

    @staticmethod
    def get_temp_directory():
        return os.path.join(os.path.dirname(__file__), 'temp')

    @staticmethod
    def set_temp_directory(path):
        os.makedirs(path, exist_ok=True)
        return path

# Create mock server module with PromptServer
class MockPromptServer:
    def __init__(self):
        self.app = None

    def send_sync(self, *args, **kwargs):
        pass

# Create mock metadata_collector module
class MockMetadataCollector:
    def init(self):
        pass

    def get_metadata(self, prompt_id=None):
        return {}

# Initialize basic mocks before any imports
sys.modules['folder_paths'] = MockFolderPaths()
sys.modules['server'] = type('server', (), {'PromptServer': MockPromptServer()})
sys.modules['py.metadata_collector'] = MockMetadataCollector()
# Now we can safely import modules that depend on folder_paths and server
import argparse
import asyncio
import logging
from aiohttp import web

# Setup logging
logging.basicConfig(level=logging.INFO,
                    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
logger = logging.getLogger("lora-manager-standalone")

# Configure aiohttp access logger to be less verbose
logging.getLogger('aiohttp.access').setLevel(logging.WARNING)

# Now we can import the global config from our local modules
from py.config import config
class StandaloneServer:
    """Server implementation for standalone mode"""

    def __init__(self):
        self.app = web.Application(logger=logger)
        self.instance = self  # Make it compatible with PromptServer.instance pattern
        # Ensure the app's access logger is configured to reduce verbosity
        self.app._subapps = []  # Ensure this exists to avoid AttributeError
        # Configure access logging for the app
        self.app.on_startup.append(self._configure_access_logger)

    async def _configure_access_logger(self, app):
        """Configure access logger to reduce verbosity"""
        logging.getLogger('aiohttp.access').setLevel(logging.WARNING)
        # If using aiohttp>=3.8.0, configure access logger through app directly
        if hasattr(app, 'access_logger'):
            app.access_logger.setLevel(logging.WARNING)

    async def setup(self):
        """Set up the standalone server"""
        # Create placeholders for compatibility with ComfyUI's implementation
        self.last_prompt_id = None
        self.last_node_id = None
        self.client_id = None
        # Set up routes
        self.setup_routes()
        # Add startup and shutdown handlers
        self.app.on_startup.append(self.on_startup)
        self.app.on_shutdown.append(self.on_shutdown)

    def setup_routes(self):
        """Set up basic routes"""
        # Add a simple status endpoint
        self.app.router.add_get('/', self.handle_status)

    async def handle_status(self, request):
        """Handle status request by redirecting to loras page"""
        # Redirect to loras page instead of showing status
        raise web.HTTPFound('/loras')
        # Original JSON response (commented out)
        # return web.json_response({
        #     "status": "running",
        #     "mode": "standalone",
        #     "loras_roots": config.loras_roots,
        #     "checkpoints_roots": config.checkpoints_roots
        # })

    async def on_startup(self, app):
        """Startup handler"""
        logger.info("LoRA Manager standalone server starting...")

    async def on_shutdown(self, app):
        """Shutdown handler"""
        logger.info("LoRA Manager standalone server shutting down...")

    def send_sync(self, event_type, data, sid=None):
        """Stub for compatibility with PromptServer"""
        # In standalone mode, we don't have the same websocket system
        pass

    async def start(self, host='127.0.0.1', port=8188):
        """Start the server"""
        runner = web.AppRunner(self.app)
        await runner.setup()
        site = web.TCPSite(runner, host, port)
        await site.start()
        # Log the server address with a clickable localhost URL regardless of the actual binding
        logger.info(f"Server started at http://127.0.0.1:{port}")
        # Keep the server running
        while True:
            await asyncio.sleep(3600)  # Sleep for a long time

    async def publish_loop(self):
        """Stub for compatibility with PromptServer"""
        # This method exists in ComfyUI's server but we don't need it
        pass
# After all mocks are in place, import LoraManager
from py.lora_manager import LoraManager
class StandaloneLoraManager(LoraManager):
    """Extended LoraManager for standalone mode"""

    @classmethod
    def add_routes(cls, server_instance):
        """Initialize and register all routes for standalone mode"""
        app = server_instance.app

        # Store app in a global-like location for compatibility
        sys.modules['server'].PromptServer.instance = server_instance

        # Configure aiohttp access logger to be less verbose
        logging.getLogger('aiohttp.access').setLevel(logging.WARNING)

        added_targets = set()  # Track already added target paths

        # Add static routes for each lora root
        for idx, root in enumerate(config.loras_roots, start=1):
            if not os.path.exists(root):
                logger.warning(f"Lora root path does not exist: {root}")
                continue

            preview_path = f'/loras_static/root{idx}/preview'

            # Check if this root is a link path in the mappings
            real_root = root
            for target, link in config._path_mappings.items():
                if os.path.normpath(link) == os.path.normpath(root):
                    # If so, route should point to the target (real path)
                    real_root = target
                    break

            # Normalize and standardize path display for consistency
            display_root = real_root.replace('\\', '/')

            # Add static route for original path - use the normalized path
            app.router.add_static(preview_path, real_root)
            logger.info(f"Added static route {preview_path} -> {display_root}")

            # Record route mapping with normalized path
            config.add_route_mapping(real_root, preview_path)
            added_targets.add(os.path.normpath(real_root))

        # Add static routes for each checkpoint root
        for idx, root in enumerate(config.checkpoints_roots, start=1):
            if not os.path.exists(root):
                logger.warning(f"Checkpoint root path does not exist: {root}")
                continue

            preview_path = f'/checkpoints_static/root{idx}/preview'

            # Check if this root is a link path in the mappings
            real_root = root
            for target, link in config._path_mappings.items():
                if os.path.normpath(link) == os.path.normpath(root):
                    # If so, route should point to the target (real path)
                    real_root = target
                    break

            # Normalize and standardize path display for consistency
            display_root = real_root.replace('\\', '/')

            # Add static route for original path
            app.router.add_static(preview_path, real_root)
            logger.info(f"Added static route {preview_path} -> {display_root}")

            # Record route mapping
            config.add_route_mapping(real_root, preview_path)
            added_targets.add(os.path.normpath(real_root))

        # Add static routes for symlink target paths that aren't already covered
        link_idx = {
            'lora': 1,
            'checkpoint': 1
        }
        for target_path, link_path in config._path_mappings.items():
            norm_target = os.path.normpath(target_path)
            if norm_target not in added_targets:
                # Determine if this is a checkpoint or lora link based on path
                is_checkpoint = any(os.path.normpath(cp_root) in os.path.normpath(link_path) for cp_root in config.checkpoints_roots)
                is_checkpoint = is_checkpoint or any(os.path.normpath(cp_root) in norm_target for cp_root in config.checkpoints_roots)

                if is_checkpoint:
                    route_path = f'/checkpoints_static/link_{link_idx["checkpoint"]}/preview'
                    link_idx["checkpoint"] += 1
                else:
                    route_path = f'/loras_static/link_{link_idx["lora"]}/preview'
                    link_idx["lora"] += 1

                # Display path with forward slashes for consistency
                display_target = target_path.replace('\\', '/')

                app.router.add_static(route_path, target_path)
                logger.info(f"Added static route for link target {route_path} -> {display_target}")
                config.add_route_mapping(target_path, route_path)
                added_targets.add(norm_target)

        # Add static route for plugin assets
        app.router.add_static('/loras_static', config.static_path)

        # Setup feature routes
        from py.routes.lora_routes import LoraRoutes
        from py.routes.api_routes import ApiRoutes
        from py.routes.recipe_routes import RecipeRoutes
        from py.routes.checkpoints_routes import CheckpointsRoutes
        from py.routes.update_routes import UpdateRoutes
        from py.routes.usage_stats_routes import UsageStatsRoutes

        lora_routes = LoraRoutes()
        checkpoints_routes = CheckpointsRoutes()

        # Initialize routes
        lora_routes.setup_routes(app)
        checkpoints_routes.setup_routes(app)
        ApiRoutes.setup_routes(app)
        RecipeRoutes.setup_routes(app)
        UpdateRoutes.setup_routes(app)
        UsageStatsRoutes.setup_routes(app)

        # Schedule service initialization
        app.on_startup.append(lambda app: cls._initialize_services())

        # Add cleanup
        app.on_shutdown.append(cls._cleanup)
        app.on_shutdown.append(ApiRoutes.cleanup)
def parse_args():
    """Parse command line arguments"""
    parser = argparse.ArgumentParser(description="LoRA Manager Standalone Server")
    parser.add_argument("--host", type=str, default="0.0.0.0",
                        help="Host address to bind the server to (default: 0.0.0.0)")
    parser.add_argument("--port", type=int, default=8188,
                        help="Port to bind the server to (default: 8188, access via http://localhost:8188/loras)")
    # parser.add_argument("--loras", type=str, nargs="+",
    #                     help="Additional paths to LoRA model directories (optional if settings.json has paths)")
    # parser.add_argument("--checkpoints", type=str, nargs="+",
    #                     help="Additional paths to checkpoint model directories (optional if settings.json has paths)")
    parser.add_argument("--log-level", type=str, default="INFO",
                        choices=["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"],
                        help="Logging level")
    return parser.parse_args()


async def main():
    """Main entry point for standalone mode"""
    args = parse_args()
    # Set log level
    logging.getLogger().setLevel(getattr(logging, args.log_level))
    # Explicitly configure aiohttp access logger regardless of selected log level
    logging.getLogger('aiohttp.access').setLevel(logging.WARNING)
    # Create the server instance
    server = StandaloneServer()
    # Initialize routes via the standalone lora manager
    StandaloneLoraManager.add_routes(server)
    # Set up and start the server
    await server.setup()
    await server.start(host=args.host, port=args.port)


if __name__ == "__main__":
    try:
        # Run the main function
        asyncio.run(main())
    except KeyboardInterrupt:
        logger.info("Server stopped by user")
@@ -146,6 +146,18 @@ export class ImportManager {
         if (totalSizeDisplay) {
             totalSizeDisplay.textContent = 'Calculating...';
         }
+
+        // Remove any existing deleted LoRAs warning
+        const deletedLorasWarning = document.getElementById('deletedLorasWarning');
+        if (deletedLorasWarning) {
+            deletedLorasWarning.remove();
+        }
+
+        // Remove any existing early access warning
+        const earlyAccessWarning = document.getElementById('earlyAccessWarning');
+        if (earlyAccessWarning) {
+            earlyAccessWarning.remove();
+        }
     }

     toggleImportMode(mode) {
@@ -532,17 +544,17 @@ export class ImportManager {
         const nextButton = document.querySelector('#detailsStep .primary-btn');
         if (!nextButton) return;

+        // Always clean up previous warnings first
+        const existingWarning = document.getElementById('deletedLorasWarning');
+        if (existingWarning) {
+            existingWarning.remove();
+        }
+
         // Count deleted LoRAs
         const deletedLoras = this.recipeData.loras.filter(lora => lora.isDeleted).length;

         // If we have deleted LoRAs, show a warning and update button text
         if (deletedLoras > 0) {
-            // Remove any existing warning
-            const existingWarning = document.getElementById('deletedLorasWarning');
-            if (existingWarning) {
-                existingWarning.remove();
-            }
             // Create a new warning container above the buttons
             const buttonsContainer = document.querySelector('#detailsStep .modal-actions') || nextButton.parentNode;
             const warningContainer = document.createElement('div');
@@ -900,7 +900,7 @@ export function addLorasWidget(node, name, opts, callback) {
     });

     // Calculate height based on number of loras and fixed sizes
-    const calculatedHeight = CONTAINER_PADDING + HEADER_HEIGHT + (lorasData.length * LORA_ENTRY_HEIGHT);
+    const calculatedHeight = CONTAINER_PADDING + HEADER_HEIGHT + (Math.min(lorasData.length, 5) * LORA_ENTRY_HEIGHT);
     updateWidgetHeight(calculatedHeight);
 };
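The clamped-height formula from this hunk (the fix for issue #109) can be checked in isolation; the constants below are illustrative placeholders, not the widget's real pixel values:

```python
# Illustrative constants, not the widget's actual pixel values
CONTAINER_PADDING = 10
HEADER_HEIGHT = 30
LORA_ENTRY_HEIGHT = 40
MAX_VISIBLE_ENTRIES = 5

def widget_height(num_loras: int) -> int:
    # Height grows with the number of entries but is clamped at
    # five visible rows; extra entries scroll inside the widget.
    return CONTAINER_PADDING + HEADER_HEIGHT + min(num_loras, MAX_VISIBLE_ENTRIES) * LORA_ENTRY_HEIGHT
```

The `min(…, 5)` clamp is the entire change: below five entries the widget still sizes itself exactly to its contents.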

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long