Low VRAM Adventures
A new series
It feels weird to say, but I intend to start a series about Stable Diffusion and ComfyUI on older, slower hardware.
This is the announcement as well as the reading list, which I will update whenever I publish a new episode. The series may stay on CivitAI or move to a monetized (ad-supported) platform in the future.
For this series, I define Low VRAM as GPUs with 6–8 GB of VRAM.
I will not cover anything below 6 GB.
The reference system I'll be using has the following specifications:
GPU: Nvidia GTX 1060 6 GB (in service since 2017)
CPU: Intel Core i5-8400
RAM: 16 GB DDR4 2133 MHz
Any changes to the test system will be announced.
Come on, don't be a killjoy.
If you are blessed with access to much better, faster, newer systems, you may be shocked to learn that many people are not.
If you are puzzled by the idea of working with ancient hardware when so many online services let you experiment for free or for a small subscription fee, you may be surprised that not everyone wants to, or can, use such services.
I also believe that lowering the cost of entry is extremely beneficial for education. If a class can learn about generative AI on a fleet of second-hand, seven-year-old GPUs, such a lab becomes much easier to set up, budget for, or donate.
There are also individuals who cannot justify the expense of new hardware just for the sake of exploring generative AI.
And another set of practical reasons:
Optimizing for older, slower hardware also helps when optimizing for newer hardware
The ability to turn older hardware into secondary render servers
Power consumption constraints
You get the idea.
What to expect
These articles will cover a random assortment of topics, but they will share one goal: finding ways to improve render time, improve render quality, or unlock rendering entirely on Low VRAM GPUs.
List of Articles
N — Title — Publish Date — Link
1 — Cascade vs SD XL — Feb 24th 2024 — 🔗 Link to the article
TBD — AnimateDiff Optimizations — TBD — TBD