This commit is contained in:
justumen
2024-10-03 12:25:15 +02:00
parent ae0ac9e0a9
commit 2ca88b1615
11 changed files with 383 additions and 8 deletions

View File

@@ -1,6 +1,6 @@
# 🔗 Comfyui : Bjornulf_custom_nodes v0.46 🔗
# 🔗 Comfyui : Bjornulf_custom_nodes v0.48 🔗
A list of 53 custom nodes for Comfyui : Display, manipulate, and edit text, images, videos, and more.
A list of 55 custom nodes for Comfyui : Display, manipulate, and edit text, images, videos, and more.
You can manage looping operations, generate randomized content, trigger logical conditions, pause and manually control your workflows and even work with external AI tools, like Ollama or Text To Speech.
# Coffee : ☕☕☕☕☕ 5/5
@@ -43,6 +43,7 @@ You can manage looping operations, generate randomized content, trigger logical
`39.` [♻ Loop (✒🗔 Advanced Write Text + 🅰️ variables)](#39----loop--advanced-write-text)
`42.` [♻ Loop (Model+Clip+Vae) - aka Checkpoint / Model](#42----loop-modelclipvae---aka-checkpoint--model)
`53.` [♻ Loop Load checkpoint (Model Selector)](#53----loop-load-checkpoint-model-selector)
`54.` [♻ Loop Lora Selector](#54)
## 🎲 Randomization 🎲
`3.` [✒🗔 Advanced Write Text (+ 🎲 random selection and 🅰️ variables)](#3----advanced-write-text---random-selection-and-🅰%EF%B8%8F-variables)
@@ -53,6 +54,7 @@ You can manage looping operations, generate randomized content, trigger logical
`40.` [🎲 Random (Model+Clip+Vae) - aka Checkpoint / Model](#40----random-modelclipvae---aka-checkpoint--model)
`41.` [🎲 Random Load checkpoint (Model Selector)](#41----random-load-checkpoint-model-selector)
`48.` [🔀🎲 Text scrambler (🧑 Character)](#48----text-scrambler--character)
`55.` [🎲 Random Lora Selector](#55)
## 🖼💾 Image Save 💾🖼
`16.` [💾🖼💬 Save image for Bjornulf LobeChat](#16----save-image-for-bjornulf-lobechat-for-my-custom-lobe-chat)
@@ -79,12 +81,14 @@ You can manage looping operations, generate randomized content, trigger logical
`46.` [🖼🔍 Image Details](#46----image-details)
`47.` [🖼 Combine Images](#47----combine-images)
## 🚀 Checkpoints / Models 🚀
## 🚀 Load checkpoints 🚀
`40.` [🎲 Random (Model+Clip+Vae) - aka Checkpoint / Model](#40----random-modelclipvae---aka-checkpoint--model)
`41.` [🎲 Random Load checkpoint (Model Selector)](#41----random-load-checkpoint-model-selector)
`42.` [♻ Loop (Model+Clip+Vae) - aka Checkpoint / Model](#42----loop-modelclipvae---aka-checkpoint--model)
`53.` [♻ Loop Load checkpoint (Model Selector)](#53----loop-load-checkpoint-model-selector)
## 🚀 Load loras 🚀
## 📹 Video 📹
`20.` [📹 Video Ping Pong](#20----video-ping-pong)
`21.` [📹 Images to Video (FFmpeg)](#21----images-to-video)
@@ -93,7 +97,7 @@ You can manage looping operations, generate randomized content, trigger logical
`51.` [📹➜🖼 Video Path to Images](#51----video-path-to-images)
`52.` [🔊📹 Audio Video Sync](#52----audio-video-sync)
## 🦙 AI 🦙
## 🤖 AI 🤖
`19.` [🦙 Ollama](#19----ollama)
`31.` [🔊 TTS - Text to Speech](#31----tts---text-to-speech-100-local-any-voice-you-want-any-language)
@@ -113,8 +117,8 @@ You can manage looping operations, generate randomized content, trigger logical
# ☁ Usage in cloud :
Comfyui is great for local usage, but even me I sometimes need more power...
I have a computer with a 4070 super with 12GB and flux fp8 simple wokflow take about ~40 seconds. With a 4090 in the cloud I can run flux fp16 in ~12 seconds.
Comfyui is great for local usage, but I sometimes need more power than what I have...
I have a computer with a 4070 Super with 12GB, and a simple Flux fp8 workflow takes about ~40 seconds. With a 4090 in the cloud I can run Flux fp16 in ~12 seconds. (There are of course also some workflows that I can't even run locally.)
My referral link for Runpod : <https://runpod.io?ref=tkowk7g5> (If you use it, I will get a commission, at no extra cost to you.)
If you want to use my nodes and Comfyui in the cloud (and can install more stuff), I'm managing an optimized ready-to-use template on Runpod : <https://runpod.io/console/deploy?template=r32dtr35u1&ref=tkowk7g5>
@@ -242,6 +246,7 @@ cd /where/you/installed/ComfyUI && python main.py
- **v0.45**: Add a new node : Text scrambler (Character), change text randomly using the file `scrambler/scrambler_character.json` in the comfyui custom nodes folder.
- **v0.46**: ❗ A lot of changes to Video nodes. Save to video now uses FLOAT for fps, not INT (a lot of other custom nodes do the same). Added a node to preview video, a node to convert a video path to a list of images, a node to convert a list of images to a temporary video + video_path, and a node to synchronize the duration of audio with video (useful for MuseTalk). The TTS node gains many new outputs ("audio_path", "full_path", "duration") for reuse with other nodes like MuseTalk; its input is also renamed to "connect_to_workflow", to avoid sending text to it by mistake.
- **v0.47**: New node : Loop Load checkpoint (Model Selector).
- **v0.48**: Two new nodes for loras : Random Lora Selector and Loop Lora Selector.
# 📝 Nodes descriptions
@@ -838,3 +843,22 @@ It will loop over all the selected checkpoints.
❗ The big difference with 41 is that checkpoints are preloaded in memory, so you can run them all much faster in one go.
It is a good way to test multiple checkpoints quickly.
### 54 - ♻ Loop Lora Selector
![loop lora selector](screenshots/loop_lora_selector.png)
**Description:**
Loop over all the selected Loras.
Above is an example with Pony and several styles of Lora.
Below is another example, here with Flux, to test whether your Lora training is undertrained, overtrained, or just right:
![loop lora selector](screenshots/loop_lora_selector_flux.png)
### 55 - 🎲 Random Lora Selector
![random lora selector](screenshots/random_lora_selector.png)
**Description:**
Takes a single Lora at random from a list of Loras.
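Both Lora selector nodes derive their `lora_name` and `lora_folder` outputs from the selected file path. A minimal sketch of that derivation (the path below is a hypothetical example):

```python
import os

# Hypothetical lora path as it would appear in the "loras" folder list.
path = "flux/my_style.safetensors"

# lora_name: basename without folders or extension.
lora_name = os.path.splitext(os.path.basename(path))[0]
# lora_folder: the immediate parent folder of the file.
lora_folder = os.path.basename(os.path.dirname(path))

print(lora_name, lora_folder)  # -> my_style flux
```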

View File

@@ -56,9 +56,13 @@ from .video_path_to_images import VideoToImagesList
from .images_to_video_path import ImagesListToVideo
from .video_preview import VideoPreview
from .loop_model_selector import LoopModelSelector
from .random_lora_selector import RandomLoraSelector
from .loop_lora_selector import LoopLoraSelector
NODE_CLASS_MAPPINGS = {
"Bjornulf_ollamaLoader": ollamaLoader,
"Bjornulf_LoopLoraSelector": LoopLoraSelector,
"Bjornulf_RandomLoraSelector": RandomLoraSelector,
"Bjornulf_LoopModelSelector": LoopModelSelector,
"Bjornulf_VideoPreview": VideoPreview,
"Bjornulf_ImagesListToVideo": ImagesListToVideo,
@@ -116,6 +120,8 @@ NODE_CLASS_MAPPINGS = {
NODE_DISPLAY_NAME_MAPPINGS = {
"Bjornulf_WriteText": "✒ Write Text",
"Bjornulf_LoopLoraSelector": "♻ Loop Lora Selector",
"Bjornulf_RandomLoraSelector": "🎲 Random Lora Selector",
"Bjornulf_LoopModelSelector": "♻ Loop Load checkpoint (Model Selector)",
"Bjornulf_VideoPreview": "📹👁 Video Preview",
"Bjornulf_ImagesListToVideo": "🖼➜📹 Images to Video path (tmp video)",

loop_lora_selector.py Normal file
View File

@@ -0,0 +1,71 @@
import os
from folder_paths import get_filename_list, get_full_path
import comfy.sd
import comfy.utils
class LoopLoraSelector:
@classmethod
def INPUT_TYPES(cls):
lora_list = get_filename_list("loras")
optional_inputs = {}
for i in range(1, 21):
optional_inputs[f"lora_{i}"] = (lora_list, {"default": lora_list[min(i-1, len(lora_list)-1)]})
optional_inputs[f"strength_model_{i}"] = ("FLOAT", {"default": 1.0, "min": -100.0, "max": 100.0, "step": 0.01})
optional_inputs[f"strength_clip_{i}"] = ("FLOAT", {"default": 1.0, "min": -100.0, "max": 100.0, "step": 0.01})
return {
"required": {
"number_of_loras": ("INT", {"default": 3, "min": 1, "max": 20, "step": 1}),
"model": ("MODEL",),
"clip": ("CLIP",),
},
"optional": optional_inputs
}
RETURN_TYPES = ("MODEL", "CLIP", "STRING", "STRING", "STRING")
RETURN_NAMES = ("model", "clip", "lora_path", "lora_name", "lora_folder")
FUNCTION = "loop_select_lora"
CATEGORY = "Bjornulf"
OUTPUT_IS_LIST = (True, True, True, True, True)
def loop_select_lora(self, number_of_loras, model, clip, **kwargs):
available_loras = []
strengths_model = []
strengths_clip = []
for i in range(1, number_of_loras + 1):
lora_key = f"lora_{i}"
strength_model_key = f"strength_model_{i}"
strength_clip_key = f"strength_clip_{i}"
if lora_key in kwargs and kwargs[lora_key]:
available_loras.append(kwargs[lora_key])
strengths_model.append(kwargs.get(strength_model_key, 1.0))
strengths_clip.append(kwargs.get(strength_clip_key, 1.0))
if not available_loras:
raise ValueError("No Loras selected")
models = []
clips = []
lora_paths = []
lora_names = []
lora_folders = []
for selected_lora, strength_model, strength_clip in zip(available_loras, strengths_model, strengths_clip):
lora_name = os.path.splitext(os.path.basename(selected_lora))[0]
lora_path = get_full_path("loras", selected_lora)
lora_folder = os.path.basename(os.path.dirname(lora_path))
lora = comfy.utils.load_torch_file(lora_path, safe_load=True)
model_lora, clip_lora = comfy.sd.load_lora_for_models(model, clip, lora, strength_model, strength_clip)
models.append(model_lora)
clips.append(clip_lora)
lora_paths.append(lora_path)
lora_names.append(lora_name)
lora_folders.append(lora_folder)
return (models, clips, lora_paths, lora_names, lora_folders)
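The loop node builds five parallel lists and, via `OUTPUT_IS_LIST`, lets ComfyUI run the downstream graph once per entry. The list-building pattern can be sketched independently of ComfyUI, with a hypothetical `fake_load` standing in for `comfy.sd.load_lora_for_models`:

```python
import os

# Hypothetical stand-in for loading and applying a Lora to model/clip.
def fake_load(lora_path, strength_model, strength_clip):
    base = os.path.basename(lora_path)
    return (f"model+{base}", f"clip+{base}")

def loop_select(available_loras, strengths_model, strengths_clip):
    models, clips, names = [], [], []
    # One patched (model, clip) pair per selected Lora, in order.
    for path, sm, sc in zip(available_loras, strengths_model, strengths_clip):
        m, c = fake_load(path, sm, sc)
        models.append(m)
        clips.append(c)
        names.append(os.path.splitext(os.path.basename(path))[0])
    return models, clips, names

models, clips, names = loop_select(
    ["styles/ink.safetensors", "styles/oil.safetensors"], [1.0, 0.8], [1.0, 0.8]
)
print(names)  # -> ['ink', 'oil']
```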

View File

@@ -1,7 +1,7 @@
[project]
name = "bjornulf_custom_nodes"
description = "Nodes: Ollama, Text to Speech, Combine Texts, Random Texts, Save image for Bjornulf LobeChat, Text with random Seed, Random line from input, Combine images, Image to grayscale (black & white), Remove image Transparency (alpha), Resize Image, ..."
version = "0.47"
version = "0.48"
license = {file = "LICENSE"}
[project.urls]

random_lora_selector.py Normal file
View File

@@ -0,0 +1,64 @@
import os
import random
from folder_paths import get_filename_list, get_full_path
import comfy.sd
import comfy.utils
class RandomLoraSelector:
@classmethod
def INPUT_TYPES(cls):
lora_list = get_filename_list("loras")
optional_inputs = {}
for i in range(1, 21):
optional_inputs[f"lora_{i}"] = (lora_list, {"default": lora_list[min(i-1, len(lora_list)-1)]})
optional_inputs["seed"] = ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff})
return {
"required": {
"number_of_loras": ("INT", {"default": 3, "min": 1, "max": 20, "step": 1}),
"model": ("MODEL",),
"clip": ("CLIP",),
"strength_model": ("FLOAT", {"default": 1.0, "min": -100.0, "max": 100.0, "step": 0.01}),
"strength_clip": ("FLOAT", {"default": 1.0, "min": -100.0, "max": 100.0, "step": 0.01}),
},
"optional": optional_inputs
}
RETURN_TYPES = ("MODEL", "CLIP", "STRING", "STRING", "STRING")
RETURN_NAMES = ("model", "clip", "lora_path", "lora_name", "lora_folder")
FUNCTION = "random_select_lora"
CATEGORY = "Bjornulf"
def random_select_lora(self, number_of_loras, model, clip, strength_model, strength_clip, seed, **kwargs):
random.seed(seed)
# Collect available Loras from kwargs
available_loras = [
kwargs[f"lora_{i}"] for i in range(1, number_of_loras + 1) if f"lora_{i}" in kwargs and kwargs[f"lora_{i}"]
]
# Raise an error if no Loras are available
if not available_loras:
raise ValueError("No Loras selected")
# Randomly select a Lora
selected_lora = random.choice(available_loras)
# Get the Lora name (without folders or extensions)
lora_name = os.path.splitext(os.path.basename(selected_lora))[0]
# Get the full path to the selected Lora
lora_path = get_full_path("loras", selected_lora)
# Get the folder name where the Lora is located
lora_folder = os.path.basename(os.path.dirname(lora_path))
# Load the Lora file
lora = comfy.utils.load_torch_file(lora_path, safe_load=True)
# Apply the Lora
model_lora, clip_lora = comfy.sd.load_lora_for_models(model, clip, lora, strength_model, strength_clip)
return (model_lora, clip_lora, lora_path, lora_name, lora_folder)
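The random node seeds `random` with the node's `seed` input before picking, so a fixed seed reproduces the same Lora on every run. A minimal sketch of that selection logic (Lora filenames below are hypothetical):

```python
import random

def random_select(available_loras, seed):
    # Mirror the node's guard: fail loudly when nothing is selected.
    if not available_loras:
        raise ValueError("No Loras selected")
    # Seeding before choice() makes the pick deterministic per seed.
    random.seed(seed)
    return random.choice(available_loras)

loras = ["a.safetensors", "b.safetensors", "c.safetensors"]
pick_a = random_select(loras, seed=42)
pick_b = random_select(loras, seed=42)
# Same seed -> same pick, so a saved workflow replays identically.
assert pick_a == pick_b
```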

Binary file not shown.

After

Width:  |  Height:  |  Size: 1.2 MiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 862 KiB

Binary file not shown.

Before

Width:  |  Height:  |  Size: 182 KiB

After

Width:  |  Height:  |  Size: 200 KiB

Binary file not shown.

After

Width:  |  Height:  |  Size: 140 KiB

View File

@@ -0,0 +1,112 @@
import { app } from "../../../scripts/app.js";
app.registerExtension({
name: "Bjornulf.LoopLoraSelector",
async nodeCreated(node) {
if (node.comfyClass === "Bjornulf_LoopLoraSelector") {
const updateLoraInputs = () => {
const numLorasWidget = node.widgets.find(w => w.name === "number_of_loras");
if (!numLorasWidget) return;
const numLoras = numLorasWidget.value;
const loraList = node.widgets.find(w => w.name === "lora_1").options.values;
// Remove excess lora widgets and their corresponding strength widgets
node.widgets = node.widgets.filter(w =>
(!w.name.startsWith("lora_") &&
!w.name.startsWith("strength_model_") &&
!w.name.startsWith("strength_clip_")) ||
parseInt(w.name.split("_").pop()) <= numLoras
);
// Add new lora widgets and their corresponding strength widgets if needed
for (let i = 1; i <= numLoras; i++) {
const loraWidgetName = `lora_${i}`;
const strengthModelWidgetName = `strength_model_${i}`;
const strengthClipWidgetName = `strength_clip_${i}`;
if (!node.widgets.find(w => w.name === loraWidgetName)) {
const defaultIndex = Math.min(i - 1, loraList.length - 1);
node.addWidget("combo", loraWidgetName, loraList[defaultIndex], () => {}, {
values: loraList
});
}
if (!node.widgets.find(w => w.name === strengthModelWidgetName)) {
node.addWidget("number", strengthModelWidgetName, 1.0, () => {}, {
min: -100.0, max: 100.0, step: 0.01
});
}
if (!node.widgets.find(w => w.name === strengthClipWidgetName)) {
node.addWidget("number", strengthClipWidgetName, 1.0, () => {}, {
min: -100.0, max: 100.0, step: 0.01
});
}
}
// Reorder widgets
const orderedWidgets = [node.widgets.find(w => w.name === "number_of_loras")];
for (let i = 1; i <= numLoras; i++) {
orderedWidgets.push(
node.widgets.find(w => w.name === `lora_${i}`),
node.widgets.find(w => w.name === `strength_model_${i}`),
node.widgets.find(w => w.name === `strength_clip_${i}`)
);
}
orderedWidgets.push(...node.widgets.filter(w => !orderedWidgets.includes(w)));
node.widgets = orderedWidgets.filter(w => w !== undefined);
node.setSize(node.computeSize());
};
// Set up number_of_loras widget
const numLorasWidget = node.widgets.find(w => w.name === "number_of_loras");
if (numLorasWidget) {
numLorasWidget.callback = () => {
updateLoraInputs();
app.graph.setDirtyCanvas(true);
};
}
// Handle deserialization
const originalOnConfigure = node.onConfigure;
node.onConfigure = function(info) {
if (originalOnConfigure) {
originalOnConfigure.call(this, info);
}
// Restore lora widgets and strength widgets based on saved properties
const savedProperties = info.properties;
if (savedProperties) {
Object.keys(savedProperties).forEach(key => {
if (key.startsWith("lora_") || key.startsWith("strength_model_") || key.startsWith("strength_clip_")) {
const widgetName = key;
const widgetValue = savedProperties[key];
const existingWidget = node.widgets.find(w => w.name === widgetName);
if (existingWidget) {
existingWidget.value = widgetValue;
} else {
if (key.startsWith("lora_")) {
node.addWidget("combo", widgetName, widgetValue, () => {}, {
values: node.widgets.find(w => w.name === "lora_1").options.values
});
} else {
node.addWidget("number", widgetName, widgetValue, () => {}, {
min: -100.0, max: 100.0, step: 0.01
});
}
}
}
});
}
// Update lora inputs after restoring saved state
updateLoraInputs();
};
// Initial update
updateLoraInputs();
}
}
});
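The widget filter in the extension above keeps a widget when it is not a per-lora widget, or when its trailing index is within `number_of_loras`. That keep/remove predicate can be sketched in Python (widget names below are the ones the node actually uses):

```python
def keep(widget_name, num_loras):
    # Non-per-lora widgets (number_of_loras, model, clip, ...) always stay.
    prefixes = ("lora_", "strength_model_", "strength_clip_")
    if not widget_name.startswith(prefixes):
        return True
    # Per-lora widgets stay only while their trailing index is in range.
    return int(widget_name.rsplit("_", 1)[-1]) <= num_loras

assert keep("number_of_loras", 3)
assert keep("lora_2", 3)
assert not keep("strength_clip_4", 3)
```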

View File

@@ -0,0 +1,98 @@
import { app } from "../../../scripts/app.js";
app.registerExtension({
name: "Bjornulf.RandomLoraSelector",
async nodeCreated(node) {
if (node.comfyClass === "Bjornulf_RandomLoraSelector") {
const updateLoraInputs = () => {
const numLorasWidget = node.widgets.find(w => w.name === "number_of_loras");
if (!numLorasWidget) return;
const numLoras = numLorasWidget.value;
const loraList = node.widgets.find(w => w.name === "lora_1").options.values;
// Remove excess lora widgets
node.widgets = node.widgets.filter(w => !w.name.startsWith("lora_") || parseInt(w.name.split("_")[1]) <= numLoras);
// Add new lora widgets if needed
for (let i = 1; i <= numLoras; i++) {
const widgetName = `lora_${i}`;
if (!node.widgets.find(w => w.name === widgetName)) {
const defaultIndex = Math.min(i - 1, loraList.length - 1);
node.addWidget("combo", widgetName, loraList[defaultIndex], () => {}, {
values: loraList
});
}
}
// Reorder widgets
node.widgets.sort((a, b) => {
if (a.name === "number_of_loras") return -1;
if (b.name === "number_of_loras") return 1;
if (a.name === "seed") return 1;
if (b.name === "seed") return -1;
if (a.name.startsWith("lora_") && b.name.startsWith("lora_")) {
return parseInt(a.name.split("_")[1]) - parseInt(b.name.split("_")[1]);
}
return a.name.localeCompare(b.name);
});
node.setSize(node.computeSize());
};
// Set up number_of_loras widget
const numLorasWidget = node.widgets.find(w => w.name === "number_of_loras");
if (numLorasWidget) {
numLorasWidget.callback = () => {
updateLoraInputs();
app.graph.setDirtyCanvas(true);
};
}
// Set seed widget to integer input
const seedWidget = node.widgets.find((w) => w.name === "seed");
if (seedWidget) {
seedWidget.type = "HIDDEN"; // Hide the seed widget in the UI (its value is still serialized)
}
// Handle deserialization
const originalOnConfigure = node.onConfigure;
node.onConfigure = function(info) {
if (originalOnConfigure) {
originalOnConfigure.call(this, info);
}
// Restore lora widgets based on saved properties
const savedProperties = info.properties;
if (savedProperties) {
Object.keys(savedProperties).forEach(key => {
if (key.startsWith("lora_")) {
const widgetName = key;
const widgetValue = savedProperties[key];
const existingWidget = node.widgets.find(w => w.name === widgetName);
if (existingWidget) {
existingWidget.value = widgetValue;
} else {
node.addWidget("combo", widgetName, widgetValue, () => {}, {
values: node.widgets.find(w => w.name === "lora_1").options.values
});
}
}
});
}
// Ensure seed is a valid integer
const seedWidget = node.widgets.find(w => w.name === "seed");
if (seedWidget && isNaN(parseInt(seedWidget.value))) {
seedWidget.value = 0; // Set a default value if invalid
}
// Update lora inputs after restoring saved state
updateLoraInputs();
};
// Initial update
updateLoraInputs();
}
}
});