Depends on what you class as a hobbyist, but I've been spinning up a T4 for a few minutes at a time to get acquainted with the tools and concepts, and I found modal.com really good for this. They resell AWS and GCP capacity at the moment.
They also have A100s, but a T4 is all I need for now.
I think this is more applicable to training use cases. If you can get by with less than $30/mo in AWS compute (which is quite expensive), then it likely doesn't make a difference.
What I mean is that you can rent four 3090 GPUs for much less than an A100 on AWS, because you aren't paying Nvidia's "cloud tax" on flops/$.
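To make the flops/$ point concrete, here's a back-of-the-envelope comparison. The hourly rates below are illustrative assumptions (not live quotes), and the peak FP16 figures are the vendor spec-sheet numbers:

```python
# Back-of-the-envelope flops-per-dollar comparison.
# Hourly prices are hypothetical placeholders; check current pricing yourself.

A100_FP16_TFLOPS = 312       # A100 peak dense FP16 tensor throughput (vendor spec)
RTX3090_FP16_TFLOPS = 71     # RTX 3090 peak dense FP16 tensor throughput (vendor spec)

aws_a100_per_hr = 4.10       # assumed on-demand $/hr for one A100 on a big cloud
marketplace_3090_per_hr = 0.30  # assumed $/hr per 3090 on a rental marketplace

a100_value = A100_FP16_TFLOPS / aws_a100_per_hr
quad_3090_value = (4 * RTX3090_FP16_TFLOPS) / (4 * marketplace_3090_per_hr)

print(f"A100 on big cloud: {a100_value:.0f} peak TFLOPS per $/hr")
print(f"4x 3090 rented:    {quad_3090_value:.0f} peak TFLOPS per $/hr")
```

Under these assumed prices the 3090s come out roughly 3x better on raw peak flops per dollar, though interconnect, VRAM per GPU, and reliability all favor the A100 for some workloads.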
Many thanks for posting about vast.ai, which I had never heard of! It's a sort of gig-economy marketplace for GPUs. The first machine I tried just now worked fine: 512GB of RAM, 256 AMD CPU cores, an A100 GPU, and I got about 4 minutes for $0.05 (which they provided as free credit).
I have never seen a GPU crunch quite like the current one. To anyone interested in hobbyist ML, I highly recommend vast.ai.