refactor: move No LoRA feature from LoRA Pool to LoRA Cycler widget

Move the 'empty/no LoRA' cycling functionality from the LoRA Pool node
to the LoRA Cycler widget for a cleaner architecture:

Frontend changes:
- Add include_no_lora field to CyclerConfig interface
- Add includeNoLora state and logic to useLoraCyclerState composable
- Add toggle UI in LoraCyclerSettingsView with special styling
- Show 'No LoRA' entry in LoraListModal when enabled
- Update LoraCyclerWidget to integrate new logic

Backend changes (see the sketch after this list):
- lora_cycler.py reads include_no_lora from config
- Calculate effective_total_count (actual count + 1 when enabled)
- Return empty lora_stack when on No LoRA position
- Return actual LoRA count in total_count (not effective count)
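
The gist of the cycling logic, as a minimal sketch (the function shape,
argument names, and the assumption that the No LoRA slot comes last are
illustrative, not the actual lora_cycler.py code):

    def cycle(config, loras, position):
        include_no_lora = config.get("include_no_lora", False)

        actual_count = len(loras)
        # One extra cycle slot when the No LoRA feature is enabled
        effective_total_count = actual_count + (1 if include_no_lora else 0)

        index = position % effective_total_count
        if include_no_lora and index == actual_count:
            lora_stack = []  # empty stack on the No LoRA position
        else:
            lora_stack = [loras[index]]

        # total_count reports the actual LoRA count, not the effective count
        return lora_stack, actual_count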

Reverted files to pre-PR state:
- lora_loader.py, lora_pool.py, lora_randomizer.py, lora_stacker.py
- lora_routes.py, lora_service.py
- LoraPoolWidget.vue and related files

Related to PR #861

Co-authored-by: dogatech <dogatech@dogatech.home>
Author: Will Miao
Date:   2026-03-19 14:19:49 +08:00
parent 8dd849892d
commit 1ae1b0d607
22 changed files with 459 additions and 316 deletions

lora_loader.py

@@ -53,8 +53,6 @@ class LoraLoaderLM:
         # First process lora_stack if available
         if lora_stack:
             for lora_path, model_strength, clip_strength in lora_stack:
-                if lora_path == "None" or not lora_path:
-                    continue
                 # Extract lora name and convert to absolute path
                 # lora_stack stores relative paths, but load_torch_file needs absolute paths
                 lora_name = extract_lora_name(lora_path)
@@ -80,7 +78,7 @@ class LoraLoaderLM:
         # Then process loras from kwargs with support for both old and new formats
         loras_list = get_loras_list(kwargs)
         for lora in loras_list:
-            if not lora.get('active', False) or lora.get('name') == "None":
+            if not lora.get('active', False):
                 continue
             lora_name = lora['name']
@@ -199,8 +197,6 @@ class LoraTextLoaderLM:
         # First process lora_stack if available
         if lora_stack:
             for lora_path, model_strength, clip_strength in lora_stack:
-                if lora_path == "None" or not lora_path:
-                    continue
                 # Extract lora name and convert to absolute path
                 # lora_stack stores relative paths, but load_torch_file needs absolute paths
                 lora_name = extract_lora_name(lora_path)
@@ -227,8 +223,6 @@ class LoraTextLoaderLM:
         parsed_loras = self.parse_lora_syntax(lora_syntax)
         for lora in parsed_loras:
             lora_name = lora['name']
-            if lora_name == "None":
-                continue
             model_strength = lora['model_strength']
             clip_strength = lora['clip_strength']
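
Note: the guards removed above are safe to drop because the cycler now
emits an empty lora_stack on the No LoRA position, rather than the
("None", ...) placeholder entries the guards were presumably written to
skip. A minimal illustration of the assumed behavior:

    lora_stack = []  # assumed cycler output on the No LoRA position
    for lora_path, model_strength, clip_strength in lora_stack:
        pass  # never reached; an empty stack yields zero iterations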