mirror of
https://github.com/justUmen/Bjornulf_custom_nodes.git
synced 2026-03-21 20:52:11 -03:00
add requirements.txt
16 README.md

@@ -1,5 +1,21 @@

# 🔗 ComfyUI : Bjornulf_custom_nodes v0.21 🔗

# ☁ Usage in the cloud :

If you want to use my nodes and ComfyUI in the cloud, I maintain an optimized template on RunPod : <https://runpod.io/console/deploy?template=r32dtr35u1&ref=tkowk7g5>

Template name : `bjornulf-comfyui-allin-workspace`; it can be operational in ~3 minutes (depending on your pod).

⚠️ You need to open a terminal in the browser (after clicking `connect` on your pod) and launch ComfyUI with : `cd /workspace/ComfyUI && python main.py --listen 0.0.0.0 --port 3000` (it is much better to control it from a terminal : you can check the logs, etc.)

After that you can just click on the `Connect to port 3000` button.

If you have any issues with it, please let me know.

You need to create and select a network volume; the size is up to you. I have 50 GB of storage because I only use the cloud for Flux or LoRA training on a 4090 (~$0.7/hour).

Everything is kept on the RunPod network volume (`/workspace/ComfyUI`), so you can stop and restart the cloud GPU, or switch to a different GPU, without losing anything.

Zone : I recommend `EU-RO-1`, but it's up to you.

Top up your RunPod account with a minimum of $10 to start.

⚠️ Warning : you pay by the minute, so this is not recommended for testing or learning ComfyUI. Do that locally !

Only run a cloud GPU when you already have your workflow ready to run.

Advice : use a cheap GPU for testing, downloading models, and so on.

# Dependencies

- `pip install ollama` : the Python package is always required, even if you don't use my Ollama node. Installing the Ollama application itself (https://ollama.com/download) is optional, and only needed if you actually want to use that node.

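For illustration, here is a minimal, dependency-free sketch of what the Ollama node does under the hood : querying a locally running Ollama server over its HTTP API. The node itself goes through the `ollama` Python package; the model name `llama3`, the default port `11434`, and the function name below are assumptions for this sketch, not the node's actual code.

```python
# Hypothetical sketch: query a local Ollama server over its HTTP API
# using only the standard library. Assumes the Ollama application is
# running on the default port (11434) and that "llama3" has been pulled.
import json
import urllib.error
import urllib.request


def ollama_generate(prompt, model="llama3", host="http://localhost:11434"):
    """Send a single non-streaming generate request and return the text."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    try:
        print(ollama_generate("Describe a misty fjord in one sentence."))
    except (urllib.error.URLError, OSError) as exc:
        # The Python package alone is not enough for generation:
        # the Ollama server itself must be running.
        print("Ollama server unreachable:", exc)
```

The `ollama` Python package wraps this same REST API, which is why it is required even when the desktop application is not installed.
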