
[ext|node] Use an LLM to trigger more detail than you ever thought of.

SD-WEB-UI | ComfyUI | decadetw-Auto-Prompt-LLM-Vision

Motivation💡

  • Call LLM: auto-prompt for batch image generation

  • Call LLM-Vision: auto-prompt for batch image generation

  • Images gain details you never thought of before.

  • Prompt detail matters.

Usage

LLM-Text

  • batch image generation with an LLM

    • tell a story

  • use a recursive prompt to tell a story across generated images

  • Using the LLM

    • in generate-forever mode

      • see the red box in the figure below

      • just tell the LLM who, when, or what

      • the LLM takes care of the details (see the sketch after this list)

    • in story-board mode (generate a series of images that follow a story via the LLM context)

      • it's like a comic book:

      • a superstar on stage

      • she is singing

      • people give her flowers

      • a fashionable man is walking.
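
As a rough illustration of what this mode automates, here is a minimal sketch of calling a local OpenAI-compatible LLM server to expand a short idea into a detailed SD prompt, carrying earlier answers along so the story can continue across images. The URL, port, model id, and prompt wording are assumptions (LM Studio's default port is shown); adjust them to your setup.

```python
import requests

# Assumed OpenAI-compatible endpoint (LM Studio's default port); adjust for your server.
LLM_URL = "http://localhost:1234/v1/chat/completions"
MODEL = "local-model"  # assumed model id; most local servers accept the loaded model's name

def expand_prompt(idea: str, story_so_far: list[str]) -> str:
    """Ask the LLM to turn a short idea into a detailed SD prompt.

    story_so_far carries earlier answers so the next image can continue the story
    (the recursive / story-board mode described above)."""
    context = "\n".join(story_so_far)
    messages = [
        {"role": "system",
         "content": "You write vivid, comma-separated Stable Diffusion prompts."},
        {"role": "user",
         "content": f"Story so far:\n{context}\n\nNext scene: {idea}\nWrite the prompt."},
    ]
    resp = requests.post(LLM_URL,
                         json={"model": MODEL, "messages": messages,
                               "max_tokens": 200, "temperature": 0.8},
                         timeout=120)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"].strip()

story = []
for idea in ["a superstar on stage", "she is singing", "people give her flowers"]:
    detailed = expand_prompt(idea, story)
    story.append(detailed)
    print(detailed)  # feed this into the SD / ComfyUI positive prompt
```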

LLM-Vision 👀

  • batch image generation with LLM-Vision

    • let LLM-Vision look at a magazine

    • see a series of images

    • feed the last image back in to prompt the next image (see the sketch after this list)

    • make a series of images, like a comic
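
Below is a similar minimal sketch of the vision loop, assuming an OpenAI-compatible server with a vision-capable model loaded (a LLaVA-style model, for example): the last generated image is base64-encoded, sent back to the model, and the reply becomes the prompt for the next image. The endpoint, model name, and file path are placeholders.

```python
import base64
import requests

LLM_URL = "http://localhost:1234/v1/chat/completions"  # assumed OpenAI-compatible server

def prompt_from_last_image(image_path: str) -> str:
    """Send the last generated image to a vision model; get a prompt for the next panel."""
    with open(image_path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")

    payload = {
        "model": "llava",  # assumed vision model name; use whatever your server has loaded
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Describe this image, then write a Stable Diffusion prompt "
                         "for the next panel of the story."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
        "max_tokens": 200,
    }
    resp = requests.post(LLM_URL, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"].strip()

next_prompt = prompt_from_last_image("outputs/last.png")  # hypothetical path
print(next_prompt)
```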

Before and After script


ComfyUI workflow preview

Flux + Auto-LLM + Auto-Msg

ComfyUI Manager | search keyword: auto


Usage Tips

  • Tip 1:

    • leave only one keyword (or fewer) in the SD prompt (it sits deep inside the CLIP encode); move everything else into the LLM prompt

    • SD-Prompt: 1girl, [xxx,] <-- (the keywords you usually use give you the usual image)

    • LLM-Prompt: xxx, yyy, zzz <-- (move them here to trigger more detail than you ever thought of)

  • Tip 2:

    • leave only one keyword (or fewer) in the SD prompt; let the LLM handle the rest

    • SD-Prompt: 1girl,

    • LLM-Prompt: a superstar on stage. <-- (tell a story)

  • Tip 3:

    • action script - Before

      • pick a random (or sequential) line from a prompt text file and feed it into LLM-Text [read_random_line.bat] (see the sketch after these tips)

      • pick a random (or sequential) image path and feed it into LLM-Vision

    • action script - After

      • run whatever command you want

      • ex: release the LLM's VRAM after each call: "curl http://localhost:11434/api/generate -d '{"model": "llama2", "keep_alive": 0}'" @Pdonor

      • ex: and so on; interact with anything.

  • Tip X: enjoy it, let it inspire your ideas, and tell everybody how you use this.
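
For Tip 3's Before action, here is a small Python equivalent of read_random_line.bat (the file name prompts.txt is only an example): it prints one random, non-empty line from a prompt file, which can then be fed into LLM-Text.

```python
import random
import sys

def random_prompt_line(path: str = "prompts.txt") -> str:
    """Return one random, non-empty line from a prompt file."""
    with open(path, encoding="utf-8") as f:
        lines = [line.strip() for line in f if line.strip()]
    return random.choice(lines)

if __name__ == "__main__":
    # Usage: python read_random_line.py [prompts.txt]
    print(random_prompt_line(sys.argv[1] if len(sys.argv) > 1 else "prompts.txt"))
```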

Installation

Suggested software

LM Studio demo: https://lmstudio.ai/static/media/demo2.9df5a0e5a9f1d72715e0.gif

Suggested LLM models

JavaScript!

This raises a security issue, but you can consider it as follows.

Buy me a Coca cola ☕

https://buymeacoffee.com/xxoooxx

Colophon

Made for fun. I hope it brings you great joy, and perfect hair forever. Contact me with questions and comments, but not threats, please. And feel free to contribute! Pull requests and ideas in Discussions or Issues will be taken quite seriously! --- https://decade.tw
