Differences Between Colab and Rented GPUs: A Short Comparison
In this quick article I'm just going to share a few tips and try to list what rentable services are available, without trying to shove my opinions down your throat!
NOTE: This is largely not targeted at those of you still using free-tier Colab, nor is this an attack on you - Google is just a large pain in the @$$ and doesn't give out services easily enough. Also, I'm not including websites like Magespace, Pirate Diffusion, Run Diffusion or Dreamlike; those are different but amazing services and aren't the same as rented GPUs or Colab in this case.
Second note: I'm not knocking ANYONE making Colab notebooks for SD, Kohya_ss, etc. - it's not your fault Google Colab is slow.
As of around May 2023, running any remote UI on free-tier GPUs breaks Google Colab's Terms of Service. From what was seen, the change was largely aimed at Stable Diffusion users, Automatic1111 users in particular.
In theory this would include any Gradio instance.
This doesn't mean you can't cheat the system, and while we're not advocating for it (because I'm not about to get my last vestiges of Google services shut off), there are clearly tutorials on how to do it. I'm not sure why you'd want to if you COULD afford paid services; if you can't, that's understandable and I'm not judging.
As a FORMER Colab user (we're cancelling as of this month, now that my credits have wound down), I'm here to tell you: this is the worst option for training, and unless you can afford the top-tier GPUs on the Colab plans, it's not worth it even with the best notebook setups.
I know, I said I wasn't going to shove my opinions down anyone's gullet - but I just sat through training a LoRA at 1.21 it/s. That's on a T4.
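To put that iteration rate in perspective, here's a rough back-of-envelope conversion from it/s to wall-clock time. The 1,500-step run and the 3.0 it/s figure for a faster card are hypothetical examples for illustration, not numbers from my actual training runs:

```python
# Rough sketch: wall-clock minutes for a training run at a steady it/s.
# Step count and the faster-card rate below are hypothetical examples.
def training_minutes(total_steps: int, its_per_sec: float) -> float:
    """Return wall-clock minutes for total_steps at its_per_sec."""
    return total_steps / its_per_sec / 60

t4 = training_minutes(1500, 1.21)     # the T4 rate I saw: ~20.7 minutes
rented = training_minutes(1500, 3.0)  # a hypothetical faster rented card: ~8.3 minutes
print(f"T4: {t4:.1f} min, rented card: {rented:.1f} min")
```

The exact numbers will vary with batch size and resolution, but the ratio is what hurts: every extra second per iteration multiplies across the whole run.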
If you're looking for a cheaper option, please move forward. I'm sorry - for Stable Diffusion, I have no love for Google Colab. There ARE likely things it's great for, and most of that is programmers who don't need the larger GPUs.
Rented GPUs (and a tiny comparison)
While rented GPUs, from what I've seen, tend to use Docker or Jupyter instances, once you get the hang of how easy it is, you'll largely want to leave Colab in the dust.
These tend to have FASTER iterations per second, and more stability in the CPUs and overall computing setups.
You can get DIFFERENT types of GPUs depending on the service.
Take RunPod as a starting point: you can get a GPU with 25+ gigabytes of VRAM (as in, beyond the usual 15-18 Google gives you) for less than 45 cents an hour, depending on your storage requirements.
You have MANY options, and the pricing depends largely on your requirements for speed, disk space and GPU size.
When I'm off merging or training models I pick CHEAPER but stable options under 40 cents an hour, and right now on Vast AI I'm picking RTX 3090s at 20 cents an hour in US dollars.
Still not SUPER CHEAP - I mean, I'm paying 100 bucks a month largely in AI fees at this stage. But if you're doing a lot of training and a lot of work, and you don't have something at home you can use?
Rented GPUs seem to be a more stable option.
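A quick cost sketch using the hourly rates mentioned above. The monthly hour counts below are hypothetical usage figures I made up for illustration, not my actual usage:

```python
# Simple hourly-rental cost math; hour counts are hypothetical examples.
def monthly_cost(rate_per_hour: float, hours_per_month: float) -> float:
    """Return the monthly bill for an hourly-billed GPU instance."""
    return rate_per_hour * hours_per_month

heavy = monthly_cost(0.20, 100)  # a 3090 at the $0.20/hr rate, 100 hrs/month
light = monthly_cost(0.45, 40)   # the ~$0.45/hr RunPod example, 40 hrs/month
print(f"heavy use: ${heavy:.2f}/mo, light use: ${light:.2f}/mo")
```

The nice part of hourly billing is that the bill tracks your actual usage: if you only train a few nights a month, you're not paying for idle subscription time the way you do with Colab credits.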
Many have stated that certain ones are better than others, but the comparison here is quick, and I'm not trying to lord one over the other. It's up to you to figure it out.
Links to Services
Please note I've only used RunPod and Vast. A lot of these services vary by country and otherwise, so check their SSL and Terms of Service before you order an instance.
If you have any suggestions towards other services please feel free to let me know down below.
This is not really a huge article; it's just me sharing what I've felt is an interesting thing about renting GPUs.
Btw: it looks like Cloud GPUs at Google are rented by the YEAR, and they're more worried about LARGE-scale datacenter use. They never should have started their Colab service the way they did, lol.