this post was submitted on 28 Feb 2024

Stable Diffusion


Discuss matters related to our favourite AI Art generation technology

[–] [email protected] 0 points 8 months ago* (last edited 8 months ago) (2 children)

No, lol. Well, at least I'm not 100% familiar with the Pi's new offerings, and I don't know about their PCIe capabilities. Direct quote:

The tool can run on low-cost graphics processing units (GPUs) and needs roughly 8GB of RAM to process requests — versus larger models, which need high-end industrial GPUs.

Makes your question seem silly trying to imagine hooking up my GPU which is probably bigger than a Pi to a Pi.

Have been running all the image generation models on a 2060 Super (8GB VRAM) up to this point, including SD-XL, the model they "distilled" theirs from... Reading the article, I'm not really sure what exactly they think they're differentiating themselves from...

[–] [email protected] 0 points 8 months ago

There are three models and the smallest one is 700M parameters.
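For rough scale (my own back-of-the-envelope, not a figure from the article): the weights alone of a 700M-parameter model take about 1.3 GiB at fp16, well under 8GB even before activations and overhead:

```python
# Back-of-the-envelope memory needed just to hold model weights.
# Activations, the text encoder/VAE, and framework overhead add more on top.
def weight_memory_gib(params: float, bytes_per_param: int) -> float:
    """Memory for raw weights in GiB."""
    return params * bytes_per_param / 1024**3

params = 700e6  # smallest of the three models, per the comment above
print(f"fp16: {weight_memory_gib(params, 2):.2f} GiB")
print(f"fp32: {weight_memory_gib(params, 4):.2f} GiB")
```

Which is why even a modest 8GB card has plenty of headroom for a model that size.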

[–] [email protected] 0 points 8 months ago (1 children)

Makes your question seem silly trying to imagine hooking up my GPU which is probably bigger than a Pi to a Pi.

Jeff Geerling has entered the chat
