[Megathread] AI image generation (Stable Diffusion, Midjourney & Co.)

A few more bits of info on ACE-Step:

# ACE-Step 1.5 LoRA Master-Guide (Part 1/3) by Moonspell & AI Brother
## Strategic Planning & Advanced Audio Data Preparation

### 1. The Mindset: "Human-Centered Generation"
Before you begin, you must understand how ACE-Step "thinks." Unlike closed-source platforms like Suno or Udio, this model is designed for a collaborative relationship. A LoRA is your tool to teach the model your specific aesthetic. Because ACE-Step is open-source, this knowledge and the resulting model belong to you forever—free from platform risks or changing terms of service.

### 2. Technical Audio Preparation (The "Golden Source")
The ACE-Step source code (handler.py) reveals that the model clamps incoming audio with torch.clamp(audio, -1.0, 1.0). This is a critical detail: any sample that peaks above 0 dBFS will be hard-clipped (cut off), distorting the waveform before the model ever sees it.

  • Loudness Normalization (LUFS): Your training songs should have a consistent "perceived loudness." The target value is -14 LUFS (the streaming standard for Spotify/YouTube). Use tools like Audacity, a DAW, or ffmpeg to achieve this.
  • Peak Levels: Set your True Peak to -1.0 dB. This prevents the AI's internal clamp function from distorting your music during the latent encoding process.
  • Sample Rate: ACE-Step works internally with 48,000 Hz Stereo. While the tool can convert files automatically, it is best for the purity of the "Latents" (the mathematical form of the audio) if you provide your source files as 48kHz WAV (24-bit or 32-bit float).
  • Cleanup: Remove long periods of silence at the beginning and end of your tracks. Every second of training costs compute time. If a song starts with 5 seconds of silence, the LoRA will learn that "silence" is a feature of your style.
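The clamp behaviour is easy to verify numerically. A minimal sketch (numpy standing in for torch; not code from the ACE-Step repo) that peak-normalizes a waveform to -1.0 dBFS so the internal clamp never bites:

```python
# Sketch: peak-normalize a float waveform to -1.0 dBFS so that
# torch.clamp(audio, -1.0, 1.0) becomes a no-op. numpy stands in
# for torch here; this is not ACE-Step code.
import numpy as np

def peak_normalize(audio: np.ndarray, target_db: float = -1.0) -> np.ndarray:
    """Scale the waveform so its absolute peak sits at target_db (dBFS)."""
    peak = np.max(np.abs(audio))
    if peak == 0.0:
        return audio
    target_lin = 10.0 ** (target_db / 20.0)  # -1 dB is roughly 0.891 linear
    return audio * (target_lin / peak)

# A "too hot" waveform that clamp(-1, 1) would otherwise flatten:
hot = np.array([0.2, -1.4, 0.9])
safe = peak_normalize(hot)
assert np.max(np.abs(safe)) < 1.0  # the clamp no longer clips anything
```

(Full -14 LUFS loudness normalization needs a proper loudness meter such as ffmpeg's loudnorm filter; simple peak scaling only guards against clipping.)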

### 3. The Automation Hack: Directory Structure & Lyrics
The tool (dataset_builder.py) actively scans for specific file patterns. Utilizing these will save you hours of manual entry:

  • File Naming: Avoid spaces and special characters. Use a format like my_style_01.wav.
  • Accompanying Lyrics (.txt): If you are creating a LoRA for a voice or a style with vocals, create a .txt file with the EXACT same name as each .wav file.
* Example: retro_vibe_01.wav and retro_vibe_01.txt.
* Content of the .txt: Paste your lyrics and use structure tags like [Verse], [Chorus], and [Bridge]. The code recognizes these tags and helps the LoRA understand the dynamic shifts between different song sections.
* The CSV Metadata Hack: Create a file named metadata.csv in the same folder. The code (_load_csv_metadata) looks for the following columns:
* File: The filename (e.g., retro_vibe_01.wav).
* BPM: The exact tempo (Crucial for the LoRA's rhythmic stability).
* Key: The musical key (e.g., C Major, Am). The tool also understands Camelot values (e.g., 8A).
* Caption: You can pre-write your style descriptions here.
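Put together, a minimal metadata.csv could look like this (the column names follow the list above; the song data is invented for illustration):

```csv
File,BPM,Key,Caption
retro_vibe_01.wav,118,Am,"A dreamy synthwave track with warm analog pads"
retro_vibe_02.wav,124,8A,"Driving retro beat with gated reverb drums"
```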

### 4. Activation Tag Strategy (The Trigger Word)
Your Activation Tag is the anchor point. Choose it wisely based on your goals:

  • Uniqueness: Choose a word like ZX_SynthWave instead of just Synthwave. The latter is already part of the base model's knowledge. A unique tag ensures the AI accesses only your new training data when triggered.
  • Positioning:
* Prepend: Your LoRA is treated as an "addition" to existing styles. Ideal for training a specific voice or a signature instrument.
* Replace: Your LoRA takes full control over the semantics. Highly recommended if you are creating a completely new genre that does not exist in the base model.

### 5. Training Folder Checklist
Before moving to the next step (the UI), your folder should look like this:
1. 8 to 20 WAV files (Normalized to -14 LUFS, Peak -1dB, 48kHz).
2. Matching .txt files for every vocal track (Content: Lyrics + Structure Tags).
3. A metadata.csv (Optional, but highly recommended for BPM/Key precision).
4. A clearly defined, unique Activation Tag.
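Assuming the example file names from above, the finished folder might look like this:

```
my_style_dataset/
├── my_style_01.wav     # 48 kHz WAV, -14 LUFS, true peak -1 dB
├── my_style_01.txt     # lyrics with [Verse]/[Chorus]/[Bridge] tags
├── my_style_02.wav
├── my_style_02.txt
├── ...
└── metadata.csv        # columns: File, BPM, Key, Caption
```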

---

# ACE-Step 1.5 LoRA Master-Guide (Part 2/3)
## Dataset Builder & Preprocessing: The Mathematical Transformation

Once your folder is perfectly prepared, launch the ACE-Step App (python app.py). Ensure the service is initialized under the "Service Configuration" tab, as we need the model loaded for the Preprocessing step.

### Phase 1: Registering the Dataset in ACE-Step
Navigate to the 🎓 LoRA Training tab -> 📁 Dataset Builder.

* Scan Directory: Enter the path to your prepared music folder and click 🔍 Scan.
* What happens in the background? The tool now links the audio files with your .txt lyrics and reads the metadata.csv. In the table, you will see symbols: 📝 means lyrics were successfully loaded, 🎵 stands for instrumental.
  • Dataset Name: Choose a concise name. This will serve as the filename for your JSON configuration.
  • Custom Activation Tag & Position: Enter your strategically planned trigger word.
* Crucial Note on Position: If you chose "Replace" in Part 1, ACE-Step will use only your tag as the description during training. If you chose "Prepend," it will be placed before the (yet to be generated) style description.

### Phase 2: The "Secret Weapon" – Genre Ratio & Prompt Override
Inside the code (dataset_builder.py), we discovered a function that determines the flexibility of your LoRA: the Genre Ratio Slider.

* Genre Ratio (Recommendation: 30%):
* This value determines how many of your samples will be trained using short Genre Tags (e.g., "Techno, 90s, Hard") instead of the long Caption (e.g., "A driving techno beat with industrial undertones").
* Why 30%? Training exclusively on long captions makes the LoRA "rigid"—it always tries to reproduce the entire complex picture. Short genre tags loosen up the knowledge and make the LoRA more responsive to different prompts in practice.
* Prompt Override (Per Sample): In the preview section, you can decide for each song individually whether it should be treated as "Genre," "Caption," or follow the global "Ratio." Use "Genre" for very repetitive tracks and "Caption" for complex, atmospheric pieces.
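The ratio logic can be pictured like this (a sketch of the idea only; the hypothetical pick_label helper is not the actual dataset_builder.py code):

```python
# Sketch of the genre-ratio idea: with ratio=0.30, roughly 30% of
# samples train on short genre tags, the rest on the long caption.
# A per-sample override beats the global ratio. Not ACE-Step code.
import random

def pick_label(sample: dict, genre_ratio: float = 0.30,
               override: str = "ratio") -> str:
    """Choose the training text for one sample."""
    if override == "genre":
        return sample["genre_tags"]
    if override == "caption":
        return sample["caption"]
    # "ratio": fall back to the global slider value
    return sample["genre_tags"] if random.random() < genre_ratio else sample["caption"]

sample = {"genre_tags": "Techno, 90s, Hard",
          "caption": "A driving techno beat with industrial undertones"}
print(pick_label(sample, override="genre"))  # always the short tags
```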

### Phase 3: AI Auto-Labeling with the LLM
Even if you provided metadata, the AI needs to understand what it is hearing to map the sound to the text.

  • 🏷️ Auto-Label All: This button activates the Language Model (LM). It "listens" to the audio codes and writes descriptions.
  • Skip BPM/Key/Time Signature: Enable this checkbox if you already provided exact data in your metadata.csv. The AI's BPM detection is good, but your manual (real) values are always superior.
  • Transcribe vs. Format:
* Use Transcribe if you do not have lyric files (the AI will then try to recognize the vocals).
* Use Format if you have your own lyrics but want them formatted into the ACE-Step structure (e.g., with correct Meta-Tags).

### Phase 4: Quality Control (Preview & Edit)
Before we generate tensors, you must spot-check your work:
  • Use the slider to select different samples.
  • Check if the Caption accurately describes the style.
  • Important: Check the Duration field. The code pulls this value directly from the file. If it shows 0.0, there was an error reading the file—that sample should be removed.

### Phase 5: Preprocessing (The Path to the Tensor Folder)
This is the most technically demanding part of the preparation. Here, your music is translated into the language of the GPU.

* What happens during Preprocessing?
1. VAE Encoding: Your audio is converted by the VAE encoder into "Latents" (a highly compressed mathematical representation).
2. Text Embedding: Your captions and lyrics are translated into vectors by the text encoder.
3. Condition Encoding: ACE-Step pre-calculates the "instruction manual" (Encoder Hidden States) for the model.
  • Tensor Output Directory: Create a dedicated folder (e.g., preprocessed_tensors/my_project).
  • Click ⚡ Preprocess: This process will fully utilize your GPU.
* Result: You will get one .pt file per song. These files contain everything the LoRA needs to learn. From this point on, the original audio is no longer needed during the actual training.
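Conceptually, each resulting .pt file bundles the three encodings listed above. The field names and shapes below are illustrative guesses, not the real .pt schema, and numpy stands in for the torch tensors the real files contain:

```python
# Illustrative layout of one preprocessed training sample.
# Field names and shapes are assumptions, not the ACE-Step schema.
import numpy as np

sample = {
    "latents": np.zeros((8, 1024), dtype=np.float32),            # VAE-encoded audio
    "text_emb": np.zeros((77, 768), dtype=np.float32),           # encoded caption/lyrics
    "encoder_hidden_states": np.zeros((77, 768), dtype=np.float32),  # precomputed conditions
}
# Training only ever touches these tensors; the WAV is no longer read.
assert all(isinstance(v, np.ndarray) for v in sample.values())
```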

---

# ACE-Step 1.5 LoRA Master-Guide (Part 3/3)
## LoRA Training & Deployment: Final Optimization

At this stage, your data is ready as preprocessed .pt files. We now switch to the 🚀 Train LoRA tab. This is the moment when the AI learns to associate your Activation Tag with your music.

### 1. Loading the Tensors
  • Under Preprocessed Tensors Directory, enter the path to the folder created in Part 2.
  • Click 📂 Load Dataset.
  • Technical Check: The tool reads the manifest.json. Look at the info box: if the number of samples is correct and your Custom Tag appears, everything is ready.
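The sanity check can be scripted. A self-contained sketch, assuming manifest.json holds a sample list and the custom tag (the field names are guesses, not the verified ACE-Step schema):

```python
# Sketch: verify sample count and activation tag in a manifest.json.
# "samples" and "custom_tag" are assumed field names for illustration.
import json
import pathlib
import tempfile

def check_manifest(path, expected_samples: int, expected_tag: str) -> dict:
    data = json.loads(pathlib.Path(path).read_text())
    assert len(data["samples"]) == expected_samples, "sample count mismatch"
    assert data["custom_tag"] == expected_tag, "activation tag missing"
    return data

# Self-contained demo with a dummy manifest:
with tempfile.TemporaryDirectory() as d:
    p = pathlib.Path(d) / "manifest.json"
    p.write_text(json.dumps({"samples": ["a.pt", "b.pt"],
                             "custom_tag": "ZX_SynthWave"}))
    check_manifest(p, expected_samples=2, expected_tag="ZX_SynthWave")
```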

### 2. The Fork in the Road: Turbo vs. Base Model
This is the most critical part. ACE-Step 1.5 has two completely different modes of operation. A LoRA trained for Turbo will likely not work on the Base Model and vice versa.

#### Scenario A: Turbo Training (The "Sprint")
Use this for fast generation (8 steps) and creative flexibility.
* Shift: MUST be set to 3.0.
* Why? The code (trainer.py, line 27) uses a specific discrete list (TURBO_SHIFT3_TIMESTEPS). The Turbo model "jumps" through 8 very specific points in time during generation. If you train with Shift 1.0, the LoRA learns information at timestamps that the Turbo model simply skips during generation. The result would be noise or "out-of-tune" artifacts.
  • Inference Steps (Reference): 8.
  • Guidance Scale (Reference): 1.0 (According to the code, Turbo does not use CFG).

#### Scenario B: Base Training (The "Marathon")
Use this for high-fidelity audio, stem separation, and maximum control.
* Shift: 1.0.
* Why? The Base model learns linearly. A Shift of 1.0 distributes the AI's attention evenly between the broad structure and fine details (timbre, texture).
  • Inference Steps (Reference): 32 to 50.
  • Guidance Scale (Reference): 3.5 to 7.0.
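To see why the Shift value matters so much, consider the timestep-shift formula commonly used in flow-matching models (it is an assumption that ACE-Step uses this exact form): a uniform timestep t is warped to t' = shift·t / (1 + (shift − 1)·t), so Shift 3.0 pushes the sampled points toward the noisy end of the schedule while Shift 1.0 leaves them evenly spread:

```python
# Common flow-matching timestep shift (an assumption here, not
# verified against trainer.py): shift=1.0 is the identity, larger
# shifts warp the schedule toward the high-noise region.
def shifted(t: float, shift: float) -> float:
    return shift * t / (1.0 + (shift - 1.0) * t)

for t in (0.25, 0.5, 0.75):
    print(f"t={t}: shift=1.0 -> {shifted(t, 1.0):.2f}, "
          f"shift=3.0 -> {shifted(t, 3.0):.2f}")
```

A LoRA trained at Shift 1.0 therefore learns at timesteps the Shift-3.0 Turbo sampler never visits, which matches the noise/artifact symptom described above.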

### 3. Hyperparameters (The Control Dials)

| Parameter | Recommendation | The "Why" (Code Insight) |
| :--- | :--- | :--- |
| LoRA Rank (r) | 64 | Rank determines capacity. 64 is the "sweet spot." Higher values (128+) can store more detail but consume significantly more VRAM and are prone to "rote memorization" (overfitting). |
| LoRA Alpha | 128 | Alpha scales the learned weights. Stick to the 2:1 ratio (Alpha = 2x Rank). According to lora_utils.py, this stabilizes the gradient flow and prevents audio distortion (clipping). |
| Learning Rate | 1e-4 to 3e-4 | Start with 3e-4 for Turbo, 1e-4 for Base. If the rate is too high, the "Loss" (error rate) will explode. If it's too low, the AI will ignore your Activation Tag. |
| Max Epochs | 500 - 1000 | For 10-20 songs, 500-1000 epochs are ideal. The AI needs to "hear" the songs enough times to extract the essence of your style. |
| Batch Size | 1 | Stay at 1 to save VRAM. Thanks to preprocessing, training is extremely fast even with Batch 1. |

### 4. Monitoring the Training Process
Click 🚀 Start Training. The trainer will now start writing logs.
  • The Loss Plot: Watch the curve. It should go down. It’s normal for it to fluctuate slightly, but the overall trend must be downward.
  • The Log: The code uses Lightning Fabric for bf16-mixed precision. This means training is highly efficient.
  • Stopping Training: If the curve drops to 0 or shoots to infinity (NaN), stop immediately. This usually means your data is corrupted (e.g., clipping, see Part 1) or the Learning Rate is much too high.
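The stopping rule above can be automated with a trivial guard (a sketch, not trainer.py code):

```python
# Sketch: abort training when the recent loss is NaN/inf (exploding)
# or has collapsed to exactly zero (suspicious, likely bad data).
import math

def should_stop(loss_history: list, window: int = 10) -> bool:
    recent = loss_history[-window:]
    if any(math.isnan(x) or math.isinf(x) for x in recent):
        return True   # exploding gradients / corrupted data
    if recent and max(recent) == 0.0:
        return True   # loss flat at zero: something is wrong
    return False

assert should_stop([0.5, 0.3, float("nan")])
assert not should_stop([0.5, 0.4, 0.35])
```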

### 5. Export & Integration
Once training is finished (Status: Training completed), the LoRA is stored in the lora_output/final folder.
  • Enter a name under Export Path (e.g., ./checkpoints/my_custom_lora) and click 📦 Export LoRA.
  • The tool copies the adapter_model.safetensors and the configuration files to the destination.

### 6. Using the LoRA in Practice
1. Go to the Service Configuration tab.
2. Under LoRA Path, select your exported folder.
3. Click Load LoRA.
4. The Ultimate Test:
* Select the exact same Shift value you used during training (3.0 for Turbo, 1.0 for Base).
* Set the Inference Steps correctly (8 for Turbo, 50 for Base).
* Write your Activation Tag at the beginning of the prompt.
* Tip: If the LoRA feels too "weak," you can adjust the LM Codes Strength in the Advanced Settings or increase the LoRA strength once the UI supports it.

### Final Pro-Tips:
  • Reproducibility: Use a fixed Seed when testing. This allows you to see exactly how the LoRA modifies the song without the "noise" of random variation.
  • Iterative Training: If the style isn't being captured, check your captions in Part 2. Often, a description that is too vague is the reason why the AI cannot cleanly separate the style from the Activation Tag.
  • Backups: Save your .pt tensors. If a new version of ACE-Step is released, you can retrain the LoRA in minutes without repeating the time-consuming scan process.

And one more system prompt.
 
I have to say I'm not particularly fond of Flux2 Klein. I thought we'd left the whole three-legs-and-six-arms business behind us.
Flux2-Klein_00001_.png
 
I also downloaded ACE-Step and generated the demo.

I actually don't listen to much music. Music is nice, and I can enjoy it, but somehow it's an active act I have to remind myself of. I have to admit that music isn't really part of my life. So much for my background.

On my speakers, which are maybe lower-midrange stereo speakers (so none of that pseudo-surround Dolby Atmos stuff), the music sounds usable. It works.

On my studio headphones, the music sounds awful! I might not be able to tell that an AI generated it, but I would definitely recognize it as music that was recorded and mixed terribly badly.

Now I also understand how people consume music. AI will probably take over a lot here...

Keuleman wrote:
I have to say I'm not particularly fond of Flux2 Klein.
That was my first impression too. I was completely wrong! I now like using Flux 2 to edit images, including to enhance images from other AIs. Flux then adds the touch of realism that other models lack. Faces and hands can be fixed wonderfully with it. No other SDXL workflow of ours comes close anymore.

But yes, every model has its strengths and weaknesses.
 
@Meta.Morph Interesting that you hear it that way. I hear it differently: I hear music from a zero-euro model with a nice flow, good for letting it run in the background.

Edit: it really is interesting how differently music can be experienced!
 
Meta.Morph wrote:
It's just bad.
Like 90% of current "songs" on the radio. I suspect hardly anyone would notice much of a difference.
 
That's the thing. Everything is ironed flat with Autotune anyway. Not much difference from AI.
 
Keuleman wrote:
I have to say I'm not particularly fond of Flux2 Klein. I thought we'd left the whole three-legs-and-six-arms business behind us.
Personally, I find a model's overall quality and capabilities more important than details like that; fixing them can even be fun (at least I find it exciting to hunt for the right settings to repair one or more flaws in an otherwise beautiful image and bring it to perfection). It would be boring if everything were perfect right away; taking something to a higher level with your own effort is part of the image-AI craft ;)

With your image, rather than worrying about the third leg or the fingers, which are quick to fix, I'd look at the artifacts: small green dots in the image, whose origin should be identified and then prevented (could be related to steps/sampler or a LoRA).
Also, certain resolutions/aspect ratios tend to provoke such things; if so, it can help to first generate at a slightly lower resolution or a fitting ratio, and then there's upscaling/outpainting etc. :)
 
I actually don't see the small green dots (though I may just be blind with age :-D ). But yes, of course, they can be repaired. EDIT: wait, now I do, when zooming in. Maybe too many steps? Or too small a model? See below...

I don't have these problems with Z-Image, for example. There I go straight to 1080p.

You're right, fiddling with the settings is part of the fun. Maybe I'll experiment with Flux 2 Klein some more at some point.
Right now, though, I'm mostly hanging out with ACE-Step and LTX2 :-D
 
So, after some experimenting: I had too small a model, the 4B fp8 version of Flux Klein (really small, but that also gave me an extra leg). The green speckles appear only in the shadow areas. They can't be fixed by changing CFG, steps, or the sampler, and swapping the model itself didn't change anything either. Changing the resolution or explicitly prompting against them ("no green speckles, no artifacts, clean shadows, smooth dark areas") didn't change the output either. It's simply a model-inherent problem. Regular Flux 2: no green speckles in shadowy areas :-)
 
Okay, hmm, feel free to send me your exact workflow for that image.
 
Just throwing this in: the next text-to-video-with-audio model has landed, Vidu Q3.

Otherwise, attached are the most important parts of the image workflow. Uploading the workflow directly here doesn't work... or wait... one more idea.

{"id":"92112d97-bb64-4b44-86f2-ea5691ef8f6e","revision":0,"last_node_id":83,"last_link_id":159,"nodes":[{"id":9,"type":"SaveImage","pos":[730.3333996968714,341.33332860407205],"size":[520,550],"flags":{},"order":5,"mode":0,"inputs":[{"localized_name":"images","name":"images","type":"IMAGE","link":154},{"localized_name":"filename_prefix","name":"filename_prefix","type":"STRING","widget":{"name":"filename_prefix"},"link":null}],"outputs":[],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"SaveImage","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["Flux2-Klein"]},{"id":78,"type":"SaveImage","pos":[740,950],"size":[520,550],"flags":{},"order":6,"mode":4,"inputs":[{"localized_name":"images","name":"images","type":"IMAGE","link":156},{"localized_name":"filename_prefix","name":"filename_prefix","type":"STRING","widget":{"name":"filename_prefix"},"link":null}],"outputs":[],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"SaveImage","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["Flux2-Klein"]},{"id":79,"type":"MarkdownNote","pos":[-710,340],"size":[550,530],"flags":{},"order":0,"mode":0,"inputs":[],"outputs":[],"properties":{},"widgets_values":'Guide: [Subgraph\n\n## Model links (for local users)\n\n*diffusion_models*\n\n- flux-2-klein-base-9b-fp8.safetensors\n- flux-2-klein-9b-fp8.safetensors\n\n> Please visit BFL's repo, accept the agreement in the repo, and then download the models.\n\n*text_encoders*\n\n- qwen_3_8b_fp8mixed.safetensors\n\n*vae*\n\n- flux2-vae.safetensors\n\n\nModel Storage Location\n\n`\n📂 ComfyUI/\n├── 📂 models/\n│ ├── 📂 diffusion_models/\n│ │ ├── flux-2-klein-9b-fp8.safetensors\n│ │ └── flux-2-klein-base-9b-fp8.safetensors\n│ ├── 📂 text_encoders/\n│ │ └── qwen_3_8b_fp8mixed.safetensors\n│ 
└── 📂 vae/\n│ └── flux2-vae.safetensors\n`\n## Report issue\n\nNote: please update ComfyUI first (guide) and prepare required models. Desktop/Cloud ship stable builds; nightly-supported models may not be included yet, please wait for the next stable release.\n\n- Cannot run / runtime errors: ComfyUI/issues\n- UI / frontend issues: ComfyUI_frontend/issues\n- Workflow issues: workflow_templates/issues\n\n"],"color":"#222","bgcolor":"#000"},{"id":83,"type":"MarkdownNote","pos":[480,200],"size":[210,88],"flags":{},"order":1,"mode":0,"inputs":[],"outputs":[],"title":"Note","properties":{},"widgets_values":'Click on the top-right corner of the node to open the [subgraph \n👇"],"color":"#222","bgcolor":"#000"},{"id":77,"type":"a67caa28-5f85-4917-8396-36004960dd30","pos":[310,950],"size":[400,415],"flags":{},"order":4,"mode":4,"inputs":[{"label":"width","name":"value","type":"INT","widget":{"name":"value"},"link":null},{"label":"height","name":"value_1","type":"INT","widget":{"name":"value_1"},"link":null},{"name":"noise_seed","type":"INT","widget":{"name":"noise_seed"},"link":null},{"name":"unet_name","type":"COMBO","widget":{"name":"unet_name"},"link":null},{"name":"clip_name","type":"COMBO","widget":{"name":"clip_name"},"link":null},{"name":"vae_name","type":"COMBO","widget":{"name":"vae_name"},"link":null},{"label":"prompt","name":"text","type":"STRING","widget":{"name":"text"},"link":157}],"outputs":[{"localized_name":"IMAGE","name":"IMAGE","type":"IMAGE","links":[156]}],"properties":{"proxyWidgets":[["-1","text"],["-1","value"],["-1","value_1"],["-1","noise_seed"],["-1","unet_name"],["-1","clip_name"],["-1","vae_name"]],"cnr_id":"comfy-core","ver":"0.8.2","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["",1024,1024,432262096973490,"flux-2-klein-9b-fp8.safetensors","qwen_3_8b_fp8mixed.safetensors","flux2-vae.safetensors"]},{"id":76,"type":"PrimitiveStringMultiline","pos":[-140,340],"size":[400,200],"flags":{},"order":2,"mode":0,"inputs":[{"localized_name":"value","name":"value","type":"STRING","widget":{"name":"value"},"link":null}],"outputs":[{"localized_name":"STRING","name":"STRING","type":"STRING","links":[155,157]}],"title":"Prompt","properties":{"cnr_id":"comfy-core","ver":"0.9.1","Node name for S&R":"PrimitiveStringMultiline","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["Diablo 1 Today – Archer entering a dungeon, monsters noticing her from the shadows.\n\nCharacter:\n\nsame female archer, alert, bow ready\n\nslightly tense, stepping cautiously\n\nMonsters:\n\ndemonic silhouettes peeking from dark corners\n\nglowing eyes, snarling mouths\n\nsubtle mist or fog partially hiding them\n\nsmall horde just out of reach, sense of lurking threat\n\nEnvironment:\n\ngothic dungeon / ruined courtyard\n\ndim torchlight, shadows flickering across walls\n\ndebris, bones, small puddles reflecting light\n\nAction & Composition:\n\nhero-shot angle: archer foreground, monsters mid-ground\n\nslight low-angle for monster menace\n\ncinematic lighting highlights archer, monsters partially in shadow\n\nMood & Tone:\n\ntense, foreboding, imminent danger\n\nclassic Diablo 1 humor in menace: monsters\n\nultra-realistic textures: stone, moss, armor, monster 
skin"]},{"id":75,"type":"7b34ab90-36f9-45ba-a665-71d418f0df18","pos":[300.33338346244335,340.1642581807214],"size":[400,416.16907042335066],"flags":{},"order":3,"mode":0,"inputs":[{"label":"width","name":"value","type":"INT","widget":{"name":"value"},"link":null},{"label":"height","name":"value_1","type":"INT","widget":{"name":"value_1"},"link":null},{"name":"noise_seed","type":"INT","widget":{"name":"noise_seed"},"link":null},{"name":"unet_name","type":"COMBO","widget":{"name":"unet_name"},"link":null},{"name":"clip_name","type":"COMBO","widget":{"name":"clip_name"},"link":null},{"name":"vae_name","type":"COMBO","widget":{"name":"vae_name"},"link":null},{"label":"prompt","name":"text","type":"STRING","widget":{"name":"text"},"link":155}],"outputs":[{"localized_name":"IMAGE","name":"IMAGE","type":"IMAGE","links":[154]}],"properties":{"proxyWidgets":[["-1","text"],["-1","value"],["-1","value_1"],["-1","noise_seed"],["-1","unet_name"],["-1","clip_name"],["-1","vae_name"]],"cnr_id":"comfy-core","ver":"0.8.2","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["",1920,1080,432262096973490,"flux-2-klein-base-9b.safetensors","qwen_3_8b_fp8mixed.safetensors","flux2-vae.safetensors"]}],"links":[[154,75,0,9,0,"IMAGE"],[155,76,0,75,6,"STRING"],[156,77,0,78,0,"IMAGE"],[157,76,0,77,6,"STRING"]],"groups":[],"definitions":{"subgraphs":[{"id":"7b34ab90-36f9-45ba-a665-71d418f0df18","version":1,"state":{"lastGroupId":4,"lastNodeId":75,"lastLinkId":162,"lastRerouteId":0},"revision":0,"config":{},"name":"Text to Image (Flux.2 Klein 
9B)","inputNode":{"id":-10,"bounding":[-654.6666797319973,583.833333604156,120,180]},"outputNode":{"id":-20,"bounding":[1295.3333202680028,573.833333604156,120,60]},"inputs":[{"id":"765f2681-7227-4d56-b48b-73a758973d7c","name":"value","type":"INT","linkIds":[155],"label":"width","pos":[-554.6666797319973,603.833333604156]},{"id":"6527c96e-0ca0-4afe-bd2f-149a80412736","name":"value_1","type":"INT","linkIds":[157],"label":"height","pos":[-554.6666797319973,623.833333604156]},{"id":"88ef7204-7170-4353-abd9-165480205784","name":"noise_seed","type":"INT","linkIds":[158],"pos":[-554.6666797319973,643.833333604156]},{"id":"ae713dbf-52cb-471a-807d-b12fa9acbb48","name":"unet_name","type":"COMBO","linkIds":[159],"pos":[-554.6666797319973,663.833333604156]},{"id":"f29f705e-e43a-4502-b231-a1eea93a1d69","name":"clip_name","type":"COMBO","linkIds":[160],"pos":[-554.6666797319973,683.833333604156]},{"id":"a954f490-386b-4367-9ea1-1b65913963a5","name":"vae_name","type":"COMBO","linkIds":[161],"pos":[-554.6666797319973,703.833333604156]},{"id":"7061147a-fb75-450d-8e97-c8be594a8e16","name":"text","type":"STRING","linkIds":[162],"label":"prompt","pos":[-554.6666797319973,723.833333604156]}],"outputs":[{"id":"c5e7966d-07ed-4c9a-ad89-9d378a41ea7b","name":"IMAGE","type":"IMAGE","linkIds":[153],"localized_name":"IMAGE","pos":[1315.3333202680028,593.833333604156]}],"widgets":[],"nodes":[{"id":61,"type":"KSamplerSelect","pos":[461.3333202680027,468.83333360415605],"size":[270,58],"flags":{},"order":0,"mode":0,"inputs":[{"localized_name":"sampler_name","name":"sampler_name","type":"COMBO","widget":{"name":"sampler_name"},"link":null}],"outputs":[{"localized_name":"SAMPLER","name":"SAMPLER","type":"SAMPLER","links":[144]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"KSamplerSelect","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["euler"]},{"id":62,"type":"Flux2Scheduler","pos":[461.3333202680027,568.833333604156],"size":[270,106],"flags":{},"order":1,"mode":0,"inputs":[{"localized_name":"steps","name":"steps","type":"INT","widget":{"name":"steps"},"link":null},{"localized_name":"width","name":"width","type":"INT","widget":{"name":"width"},"link":137},{"localized_name":"height","name":"height","type":"INT","widget":{"name":"height"},"link":138}],"outputs":[{"localized_name":"SIGMAS","name":"SIGMAS","type":"SIGMAS","links":[145]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"Flux2Scheduler","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[20,1024,1024]},{"id":64,"type":"SamplerCustomAdvanced","pos":[761.3333202680027,248.83333360415605],"size":[212.38333740234376,106],"flags":{},"order":3,"mode":0,"inputs":[{"localized_name":"noise","name":"noise","type":"NOISE","link":142},{"localized_name":"guider","name":"guider","type":"GUIDER","link":143},{"localized_name":"sampler","name":"sampler","type":"SAMPLER","link":144},{"localized_name":"sigmas","name":"sigmas","type":"SIGMAS","link":145},{"localized_name":"latent_image","name":"latent_image","type":"LATENT","link":146}],"outputs":[{"localized_name":"output","name":"output","type":"LATENT","links":[147]},{"localized_name":"denoised_output","name":"denoised_output","type":"LATENT","links":[]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"SamplerCustomAdvanced","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[]},{"id":65,"type":"VAEDecode","pos":[1015.3333202680027,178.83333360415605],"size":[220,46],"flags":{},"order":4,"mode":0,"inputs":[{"localized_name":"samples","name":"samples","type":"LATENT","link":147},{"localized_name":"vae","name":"vae","type":"VAE","link":148}],"outputs":[{"localized_name":"IMAGE","name":"IMAGE","type":"IMAGE","slot_index":0,"links":[153]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"VAEDecode","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[]},{"id":66,"type":"EmptyFlux2LatentImage","pos":[455.3333202680027,828.833333604156],"size":[270,106],"flags":{},"order":5,"mode":0,"inputs":[{"localized_name":"width","name":"width","type":"INT","widget":{"name":"width"},"link":149},{"localized_name":"height","name":"height","type":"INT","widget":{"name":"height"},"link":150},{"localized_name":"batch_size","name":"batch_size","type":"INT","widget":{"name":"batch_size"},"link":null}],"outputs":[{"localized_name":"LATENT","name":"LATENT","type":"LATENT","links":[146]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"EmptyFlux2LatentImage","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[1024,1024,1]},{"id":67,"type":"CLIPTextEncode","pos":[-34.666679731997306,558.833333604156],"size":[430,100],"flags":{},"order":6,"mode":0,"inputs":[{"localized_name":"clip","name":"clip","type":"CLIP","link":152},{"localized_name":"text","name":"text","type":"STRING","widget":{"name":"text"},"link":null}],"outputs":[{"localized_name":"CONDITIONING","name":"CONDITIONING","type":"CONDITIONING","slot_index":0,"links":[141]}],"title":"CLIP Text Encode (Negative Prompt)","properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for 
S&R":"CLIPTextEncode","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[""],"color":"#322","bgcolor":"#533"},{"id":68,"type":"PrimitiveInt","pos":[-414.6666797319973,808.833333604156],"size":[270,82],"flags":{},"order":7,"mode":0,"inputs":[{"localized_name":"value","name":"value","type":"INT","widget":{"name":"value"},"link":155}],"outputs":[{"localized_name":"INT","name":"INT","type":"INT","links":[137,149]}],"title":"Width","properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"PrimitiveInt","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[1920,"fixed"]},{"id":69,"type":"PrimitiveInt","pos":[-414.6666797319973,958.833333604156],"size":[270,82],"flags":{},"order":8,"mode":0,"inputs":[{"localized_name":"value","name":"value","type":"INT","widget":{"name":"value"},"link":157}],"outputs":[{"localized_name":"INT","name":"INT","type":"INT","links":[138,150]}],"title":"Height","properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"PrimitiveInt","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[1080,"fixed"]},{"id":73,"type":"RandomNoise","pos":[461.3333202680027,208.83333360415605],"size":[270,82],"flags":{},"order":12,"mode":0,"inputs":[{"localized_name":"noise_seed","name":"noise_seed","type":"INT","widget":{"name":"noise_seed"},"link":158}],"outputs":[{"localized_name":"NOISE","name":"NOISE","type":"NOISE","links":[142]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"RandomNoise","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[432262096973490,"randomize"]},{"id":72,"type":"VAELoader","pos":[-464.6666797319973,548.6666840360392],"size":[364.42708333333337,58],"flags":{},"order":11,"mode":0,"inputs":[{"localized_name":"vae_name","name":"vae_name","type":"COMBO","widget":{"name":"vae_name"},"link":161}],"outputs":[{"localized_name":"VAE","name":"VAE","type":"VAE","links":[148]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"VAELoader","models":[{"name":"flux2-vae.safetensors","url":"https://huggingface.co/Comfy-Org/flux2-dev/resolve/main/split_files/vae/flux2-vae.safetensors","directory":"vae"}],"enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["flux2-vae.safetensors"]},{"id":74,"type":"CLIPTextEncode","pos":[-34.666679731997306,228.83333360415605],"size":[430,280],"flags":{},"order":13,"mode":0,"inputs":[{"localized_name":"clip","name":"clip","type":"CLIP","link":151},{"localized_name":"text","name":"text","type":"STRING","widget":{"name":"text"},"link":162}],"outputs":[{"localized_name":"CONDITIONING","name":"CONDITIONING","type":"CONDITIONING","slot_index":0,"links":[140]}],"title":"CLIP Text Encode (Positive Prompt)","properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"CLIPTextEncode","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[""],"color":"#232","bgcolor":"#353"},{"id":71,"type":"CLIPLoader","pos":[-464.6666797319973,378.66668403603927],"size":[364.42708333333337,106],"flags":{},"order":10,"mode":0,"inputs":[{"localized_name":"clip_name","name":"clip_name","type":"COMBO","widget":{"name":"clip_name"},"link":160},{"localized_name":"type","name":"type","type":"COMBO","widget":{"name":"type"},"link":null},{"localized_name":"device","name":"device","shape":7,"type":"COMBO","widget":{"name":"device"},"link":null}],"outputs":[{"localized_name":"CLIP","name":"CLIP","type":"CLIP","links":[151,152]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"CLIPLoader","models":[{"name":"qwen_3_8b_fp8mixed.safetensors","url":"https://huggingface.co/Comfy-Org/fl.../text_encoders/qwen_3_8b_fp8mixed.safetensors","directory":"text_encoders"}],"enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["qwen_3_8b_fp8mixed.safetensors","flux2","cpu"]},{"id":70,"type":"UNETLoader","pos":[-464.6666797319973,228.66668403603927],"size":[364.42708333333337,82],"flags":{},"order":9,"mode":0,"inputs":[{"localized_name":"unet_name","name":"unet_name","type":"COMBO","widget":{"name":"unet_name"},"link":159},{"localized_name":"weight_dtype","name":"weight_dtype","type":"COMBO","widget":{"name":"weight_dtype"},"link":null}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","links":[139]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"UNETLoader","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["flux-2-klein-base-9b.safetensors","default"]},{"id":63,"type":"CFGGuider","pos":[461.3333202680027,328.83333360415605],"size":[270,98],"flags":{},"order":2,"mode":0,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":139},{"localized_name":"positive","name":"positive","type":"CONDITIONING","link":140},{"localized_name":"negative","name":"negative","type":"CONDITIONING","link":141},{"localized_name":"cfg","name":"cfg","type":"FLOAT","widget":{"name":"cfg"},"link":null}],"outputs":[{"localized_name":"GUIDER","name":"GUIDER","type":"GUIDER","links":[143]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"CFGGuider","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[4]}],"groups":[{"id":1,"title":"Models","bounding":[-470,140,380,550],"color":"#3f789e","font_size":24,"flags":{}},{"id":2,"title":"Prompt","bounding":[-50,140,470,550],"color":"#3f789e","font_size":24,"flags":{}},{"id":3,"title":"Sampler","bounding":[460,140,532.3638671875,550],"color":"#3f789e","font_size":24,"flags":{}},{"id":4,"title":"Image 
Size","bounding":[-470,720,380,350],"color":"#3f789e","font_size":24,"flags":{}}],"links":[{"id":137,"origin_id":68,"origin_slot":0,"target_id":62,"target_slot":1,"type":"INT"},{"id":138,"origin_id":69,"origin_slot":0,"target_id":62,"target_slot":2,"type":"INT"},{"id":139,"origin_id":70,"origin_slot":0,"target_id":63,"target_slot":0,"type":"MODEL"},{"id":140,"origin_id":74,"origin_slot":0,"target_id":63,"target_slot":1,"type":"CONDITIONING"},{"id":141,"origin_id":67,"origin_slot":0,"target_id":63,"target_slot":2,"type":"CONDITIONING"},{"id":142,"origin_id":73,"origin_slot":0,"target_id":64,"target_slot":0,"type":"NOISE"},{"id":143,"origin_id":63,"origin_slot":0,"target_id":64,"target_slot":1,"type":"GUIDER"},{"id":144,"origin_id":61,"origin_slot":0,"target_id":64,"target_slot":2,"type":"SAMPLER"},{"id":145,"origin_id":62,"origin_slot":0,"target_id":64,"target_slot":3,"type":"SIGMAS"},{"id":146,"origin_id":66,"origin_slot":0,"target_id":64,"target_slot":4,"type":"LATENT"},{"id":147,"origin_id":64,"origin_slot":0,"target_id":65,"target_slot":0,"type":"LATENT"},{"id":148,"origin_id":72,"origin_slot":0,"target_id":65,"target_slot":1,"type":"VAE"},{"id":149,"origin_id":68,"origin_slot":0,"target_id":66,"target_slot":0,"type":"INT"},{"id":150,"origin_id":69,"origin_slot":0,"target_id":66,"target_slot":1,"type":"INT"},{"id":152,"origin_id":71,"origin_slot":0,"target_id":67,"target_slot":0,"type":"CLIP"},{"id":151,"origin_id":71,"origin_slot":0,"target_id":74,"target_slot":0,"type":"CLIP"},{"id":153,"origin_id":65,"origin_slot":0,"target_id":-20,"target_slot":0,"type":"IMAGE"},{"id":155,"origin_id":-10,"origin_slot":0,"target_id":68,"target_slot":0,"type":"INT"},{"id":157,"origin_id":-10,"origin_slot":1,"target_id":69,"target_slot":0,"type":"INT"},{"id":158,"origin_id":-10,"origin_slot":2,"target_id":73,"target_slot":0,"type":"INT"},{"id":159,"origin_id":-10,"origin_slot":3,"target_id":70,"target_slot":0,"type":"COMBO"},{"id":160,"origin_id":-10,"origin_slot":4,"target_id":
71,"target_slot":0,"type":"COMBO"},{"id":161,"origin_id":-10,"origin_slot":5,"target_id":72,"target_slot":0,"type":"COMBO"},{"id":162,"origin_id":-10,"origin_slot":6,"target_id":74,"target_slot":1,"type":"STRING"}],"extra":{"workflowRendererVersion":"LG"}},{"id":"a67caa28-5f85-4917-8396-36004960dd30","version":1,"state":{"lastGroupId":4,"lastNodeId":76,"lastLinkId":164,"lastRerouteId":0},"revision":0,"config":{},"name":"Text to Image (Flux.2 Klein 9B Distilled)","inputNode":{"id":-10,"bounding":[-654.6666797319973,583.833333604156,120,180]},"outputNode":{"id":-20,"bounding":[1295.3333202680028,573.833333604156,120,60]},"inputs":[{"id":"765f2681-7227-4d56-b48b-73a758973d7c","name":"value","type":"INT","linkIds":[155],"label":"width","pos":[-554.6666797319973,603.833333604156]},{"id":"6527c96e-0ca0-4afe-bd2f-149a80412736","name":"value_1","type":"INT","linkIds":[157],"label":"height","pos":[-554.6666797319973,623.833333604156]},{"id":"88ef7204-7170-4353-abd9-165480205784","name":"noise_seed","type":"INT","linkIds":[158],"pos":[-554.6666797319973,643.833333604156]},{"id":"ae713dbf-52cb-471a-807d-b12fa9acbb48","name":"unet_name","type":"COMBO","linkIds":[159],"pos":[-554.6666797319973,663.833333604156]},{"id":"f29f705e-e43a-4502-b231-a1eea93a1d69","name":"clip_name","type":"COMBO","linkIds":[160],"pos":[-554.6666797319973,683.833333604156]},{"id":"a954f490-386b-4367-9ea1-1b65913963a5","name":"vae_name","type":"COMBO","linkIds":[161],"pos":[-554.6666797319973,703.833333604156]},{"id":"7061147a-fb75-450d-8e97-c8be594a8e16","name":"text","type":"STRING","linkIds":[162],"label":"prompt","pos":[-554.6666797319973,723.833333604156]}],"outputs":[{"id":"c5e7966d-07ed-4c9a-ad89-9d378a41ea7b","name":"IMAGE","type":"IMAGE","linkIds":[153],"localized_name":"IMAGE","pos":[1315.3333202680028,593.833333604156]}],"widgets":[],"nodes":[{"id":61,"type":"KSamplerSelect","pos":[461.3333202680027,468.83333360415605],"size":[270,58],"flags":{},"order":0,"mode":4,"inputs":[{"localized_name":"
sampler_name","name":"sampler_name","type":"COMBO","widget":{"name":"sampler_name"},"link":null}],"outputs":[{"localized_name":"SAMPLER","name":"SAMPLER","type":"SAMPLER","links":[144]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"KSamplerSelect","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["euler"]},{"id":64,"type":"SamplerCustomAdvanced","pos":[761.3333202680027,248.83333360415605],"size":[212.38333740234376,106],"flags":{},"order":3,"mode":4,"inputs":[{"localized_name":"noise","name":"noise","type":"NOISE","link":142},{"localized_name":"guider","name":"guider","type":"GUIDER","link":143},{"localized_name":"sampler","name":"sampler","type":"SAMPLER","link":144},{"localized_name":"sigmas","name":"sigmas","type":"SIGMAS","link":145},{"localized_name":"latent_image","name":"latent_image","type":"LATENT","link":146}],"outputs":[{"localized_name":"output","name":"output","type":"LATENT","links":[147]},{"localized_name":"denoised_output","name":"denoised_output","type":"LATENT","links":[]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"SamplerCustomAdvanced","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[]},{"id":65,"type":"VAEDecode","pos":[1015.3333202680027,178.83333360415605],"size":[220,46],"flags":{},"order":4,"mode":4,"inputs":[{"localized_name":"samples","name":"samples","type":"LATENT","link":147},{"localized_name":"vae","name":"vae","type":"VAE","link":148}],"outputs":[{"localized_name":"IMAGE","name":"IMAGE","type":"IMAGE","slot_index":0,"links":[153]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"VAEDecode","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[]},{"id":66,"type":"EmptyFlux2LatentImage","pos":[455.3333202680027,828.833333604156],"size":[270,106],"flags":{},"order":5,"mode":4,"inputs":[{"localized_name":"width","name":"width","type":"INT","widget":{"name":"width"},"link":149},{"localized_name":"height","name":"height","type":"INT","widget":{"name":"height"},"link":150},{"localized_name":"batch_size","name":"batch_size","type":"INT","widget":{"name":"batch_size"},"link":null}],"outputs":[{"localized_name":"LATENT","name":"LATENT","type":"LATENT","links":[146]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"EmptyFlux2LatentImage","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[1024,1024,1]},{"id":68,"type":"PrimitiveInt","pos":[-414.6666797319973,808.833333604156],"size":[270,82],"flags":{},"order":6,"mode":4,"inputs":[{"localized_name":"value","name":"value","type":"INT","widget":{"name":"value"},"link":155}],"outputs":[{"localized_name":"INT","name":"INT","type":"INT","links":[137,149]}],"title":"Width","properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"PrimitiveInt","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[1024,"fixed"]},{"id":69,"type":"PrimitiveInt","pos":[-414.6666797319973,958.833333604156],"size":[270,82],"flags":{},"order":7,"mode":4,"inputs":[{"localized_name":"value","name":"value","type":"INT","widget":{"name":"value"},"link":157}],"outputs":[{"localized_name":"INT","name":"INT","type":"INT","links":[138,150]}],"title":"Height","properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"PrimitiveInt","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[1024,"fixed"]},{"id":73,"type":"RandomNoise","pos":[461.3333202680027,208.83333360415605],"size":[270,82],"flags":{},"order":11,"mode":4,"inputs":[{"localized_name":"noise_seed","name":"noise_seed","type":"INT","widget":{"name":"noise_seed"},"link":158}],"outputs":[{"localized_name":"NOISE","name":"NOISE","type":"NOISE","links":[142]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"RandomNoise","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[432262096973490,"randomize"]},{"id":70,"type":"UNETLoader","pos":[-464.6666797319973,228.66668403603927],"size":[364.42708333333337,82],"flags":{},"order":8,"mode":4,"inputs":[{"localized_name":"unet_name","name":"unet_name","type":"COMBO","widget":{"name":"unet_name"},"link":159},{"localized_name":"weight_dtype","name":"weight_dtype","type":"COMBO","widget":{"name":"weight_dtype"},"link":null}],"outputs":[{"localized_name":"MODEL","name":"MODEL","type":"MODEL","links":[139]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"UNETLoader","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["flux-2-klein-9b-fp8.safetensors","default"]},{"id":71,"type":"CLIPLoader","pos":[-464.6666797319973,378.66668403603927],"size":[364.42708333333337,106],"flags":{},"order":9,"mode":4,"inputs":[{"localized_name":"clip_name","name":"clip_name","type":"COMBO","widget":{"name":"clip_name"},"link":160},{"localized_name":"type","name":"type","type":"COMBO","widget":{"name":"type"},"link":null},{"localized_name":"device","name":"device","shape":7,"type":"COMBO","widget":{"name":"device"},"link":null}],"outputs":[{"localized_name":"CLIP","name":"CLIP","type":"CLIP","links":[151]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"CLIPLoader","models":[{"name":"qwen_3_8b_fp8mixed.safetensors","url":"https://huggingface.co/Comfy-Org/fl.../text_encoders/qwen_3_8b_fp8mixed.safetensors","directory":"text_encoders"}],"enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["qwen_3_8b_fp8mixed.safetensors","flux2","default"]},{"id":72,"type":"VAELoader","pos":[-464.6666797319973,548.6666840360392],"size":[364.42708333333337,58],"flags":{},"order":10,"mode":4,"inputs":[{"localized_name":"vae_name","name":"vae_name","type":"COMBO","widget":{"name":"vae_name"},"link":161}],"outputs":[{"localized_name":"VAE","name":"VAE","type":"VAE","links":[148]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"VAELoader","models":[{"name":"flux2-vae.safetensors","url":"https://huggingface.co/Comfy-Org/flux2-dev/resolve/main/split_files/vae/flux2-vae.safetensors","directory":"vae"}],"enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send 
Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":["flux2-vae.safetensors"]},{"id":63,"type":"CFGGuider","pos":[461.3333202680027,328.83333360415605],"size":[270,98],"flags":{},"order":2,"mode":4,"inputs":[{"localized_name":"model","name":"model","type":"MODEL","link":139},{"localized_name":"positive","name":"positive","type":"CONDITIONING","link":140},{"localized_name":"negative","name":"negative","type":"CONDITIONING","link":164},{"localized_name":"cfg","name":"cfg","type":"FLOAT","widget":{"name":"cfg"},"link":null}],"outputs":[{"localized_name":"GUIDER","name":"GUIDER","type":"GUIDER","links":[143]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"CFGGuider","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[1]},{"id":76,"type":"ConditioningZeroOut","pos":[190,640],"size":[204.134765625,26],"flags":{},"order":13,"mode":4,"inputs":[{"localized_name":"conditioning","name":"conditioning","type":"CONDITIONING","link":163}],"outputs":[{"localized_name":"CONDITIONING","name":"CONDITIONING","type":"CONDITIONING","links":[164]}],"properties":{"cnr_id":"comfy-core","ver":"0.9.1","Node name for S&R":"ConditioningZeroOut","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[]},{"id":74,"type":"CLIPTextEncode","pos":[-34.666679731997306,228.83333360415605],"size":[440,360],"flags":{},"order":12,"mode":4,"inputs":[{"localized_name":"clip","name":"clip","type":"CLIP","link":151},{"localized_name":"text","name":"text","type":"STRING","widget":{"name":"text"},"link":162}],"outputs":[{"localized_name":"CONDITIONING","name":"CONDITIONING","type":"CONDITIONING","slot_index":0,"links":[140,163]}],"title":"CLIP Text Encode (Positive Prompt)","properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for 
S&R":"CLIPTextEncode","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[""],"color":"#232","bgcolor":"#353"},{"id":62,"type":"Flux2Scheduler","pos":[461.3333202680027,568.833333604156],"size":[270,106],"flags":{},"order":1,"mode":4,"inputs":[{"localized_name":"steps","name":"steps","type":"INT","widget":{"name":"steps"},"link":null},{"localized_name":"width","name":"width","type":"INT","widget":{"name":"width"},"link":137},{"localized_name":"height","name":"height","type":"INT","widget":{"name":"height"},"link":138}],"outputs":[{"localized_name":"SIGMAS","name":"SIGMAS","type":"SIGMAS","links":[145]}],"properties":{"cnr_id":"comfy-core","ver":"0.8.2","Node name for S&R":"Flux2Scheduler","enableTabs":false,"tabWidth":65,"tabXOffset":10,"hasSecondTab":false,"secondTabText":"Send Back","secondTabOffset":80,"secondTabWidth":65},"widgets_values":[4,1024,1024]}],"groups":[{"id":1,"title":"Models","bounding":[-470,140,380,550],"color":"#3f789e","font_size":24,"flags":{}},{"id":2,"title":"Prompt","bounding":[-50,140,470,550],"color":"#3f789e","font_size":24,"flags":{}},{"id":3,"title":"Sampler","bounding":[460,140,532.3638671875,550],"color":"#3f789e","font_size":24,"flags":{}},{"id":4,"title":"Image 
Size","bounding":[-470,720,380,350],"color":"#3f789e","font_size":24,"flags":{}}],"links":[{"id":137,"origin_id":68,"origin_slot":0,"target_id":62,"target_slot":1,"type":"INT"},{"id":138,"origin_id":69,"origin_slot":0,"target_id":62,"target_slot":2,"type":"INT"},{"id":139,"origin_id":70,"origin_slot":0,"target_id":63,"target_slot":0,"type":"MODEL"},{"id":140,"origin_id":74,"origin_slot":0,"target_id":63,"target_slot":1,"type":"CONDITIONING"},{"id":142,"origin_id":73,"origin_slot":0,"target_id":64,"target_slot":0,"type":"NOISE"},{"id":143,"origin_id":63,"origin_slot":0,"target_id":64,"target_slot":1,"type":"GUIDER"},{"id":144,"origin_id":61,"origin_slot":0,"target_id":64,"target_slot":2,"type":"SAMPLER"},{"id":145,"origin_id":62,"origin_slot":0,"target_id":64,"target_slot":3,"type":"SIGMAS"},{"id":146,"origin_id":66,"origin_slot":0,"target_id":64,"target_slot":4,"type":"LATENT"},{"id":147,"origin_id":64,"origin_slot":0,"target_id":65,"target_slot":0,"type":"LATENT"},{"id":148,"origin_id":72,"origin_slot":0,"target_id":65,"target_slot":1,"type":"VAE"},{"id":149,"origin_id":68,"origin_slot":0,"target_id":66,"target_slot":0,"type":"INT"},{"id":150,"origin_id":69,"origin_slot":0,"target_id":66,"target_slot":1,"type":"INT"},{"id":151,"origin_id":71,"origin_slot":0,"target_id":74,"target_slot":0,"type":"CLIP"},{"id":153,"origin_id":65,"origin_slot":0,"target_id":-20,"target_slot":0,"type":"IMAGE"},{"id":155,"origin_id":-10,"origin_slot":0,"target_id":68,"target_slot":0,"type":"INT"},{"id":157,"origin_id":-10,"origin_slot":1,"target_id":69,"target_slot":0,"type":"INT"},{"id":158,"origin_id":-10,"origin_slot":2,"target_id":73,"target_slot":0,"type":"INT"},{"id":159,"origin_id":-10,"origin_slot":3,"target_id":70,"target_slot":0,"type":"COMBO"},{"id":160,"origin_id":-10,"origin_slot":4,"target_id":71,"target_slot":0,"type":"COMBO"},{"id":161,"origin_id":-10,"origin_slot":5,"target_id":72,"target_slot":0,"type":"COMBO"},{"id":162,"origin_id":-10,"origin_slot":6,"target_id":74,"
target_slot":1,"type":"STRING"},{"id":163,"origin_id":74,"origin_slot":0,"target_id":76,"target_slot":0,"type":"CONDITIONING"},{"id":164,"origin_id":76,"origin_slot":0,"target_id":63,"target_slot":2,"type":"CONDITIONING"}],"extra":{"workflowRendererVersion":"LG"}}]},"config":{},"extra":{"frontendVersion":"1.38.13","workflowRendererVersion":"LG","VHS_latentpreview":false,"VHS_latentpreviewrate":0,"VHS_MetadataImage":true,"VHS_KeepIntermediate":true,"ds":{"scale":1.0350103602245235,"offset":[53.393270288855454,-136.41496939021818]}},"version":0.4}
 
I tried clicking through a few of the last pages, but unfortunately 80% of what is written there means nothing to me.

If I want to generate images locally, do I need to look into Stable Diffusion? And does it even make sense to do it locally, with a machine like the one described in my signature?
 
@Keuleman : It doesn't seem to open. Try exporting it and then putting the json file into a zip file; then you can attach it.
 
Snakeeater wrote:
I tried clicking through a few of the last pages, but unfortunately 80% of what is written there means nothing to me.

If I want to generate images locally, do I need to look into Stable Diffusion? And does it even make sense to do it locally, with a machine like the one described in my signature?

You can install the latest "Adrenalin" driver from AMD and include the AI package in the installation; it ships with a ComfyUI version (which you use to create images).

Alternatively, simply download the AMD variant from https://github.com/Comfy-Org/ComfyUI/releases. The program includes templates, and of course there are plenty of tutorials etc. on YouTube.

I would prefer the second route, because that is the current package from the developer itself, but to get ROCm support you will probably still need the PyTorch part of the AI package (the one that comes with the driver) installed anyway.

Maybe the AMD users here can add more to this :)
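For reference, the manual route sketched above (cloning ComfyUI and adding a ROCm build of PyTorch) looks roughly like this on Linux. This is a minimal sketch, not the official install guide: the ROCm version in the wheel index URL is an assumption and must be matched to what your installed driver stack actually supports.

```shell
# Minimal sketch of a manual ComfyUI setup on an AMD GPU (Linux).
# Assumption: the ROCm version in the index URL (rocm6.2) matches
# your installed driver/ROCm stack -- adjust it if it does not.
git clone https://github.com/comfyanonymous/ComfyUI.git
cd ComfyUI
python -m venv venv && . venv/bin/activate

# ROCm build of PyTorch (Linux only; on Windows, AMD's driver-bundled
# AI package is the usual way to get accelerated PyTorch)
pip install torch torchvision torchaudio \
    --index-url https://download.pytorch.org/whl/rocm6.2

pip install -r requirements.txt
python main.py   # the web UI then listens on http://127.0.0.1:8188
```

The templates mentioned above are available from the web UI once it is running; this sketch only covers getting the backend started.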
 