Ok, so: if you want to run a local LLM on your desktop, use your GPU. If you're doing that on a laptop in a cafe, get a laptop with an NPU, since an NPU runs inference at a fraction of the power a GPU draws and your battery will thank you. If you don't care about either, you don't need to think about these AI PCs. A rough sketch of the desktop case is below.
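
For what "use your GPU" looks like in practice, here's a minimal sketch using the llama-cpp-python bindings; the model path is hypothetical, and `n_gpu_layers` is the knob that decides whether layers get offloaded to the GPU or stay on the CPU:

```python
# Minimal local-LLM sketch with llama-cpp-python (pip install llama-cpp-python).
# Assumes a GGUF model file already downloaded to a local path (hypothetical here).
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # hypothetical local model file
    n_gpu_layers=-1,  # -1 offloads all layers to the GPU; set 0 to run purely on CPU
)

out = llm("Explain in one sentence why NPUs matter on laptops.", max_tokens=64)
print(out["choices"][0]["text"])
```

On a machine without a supported GPU build, the same code just runs on the CPU, which is the "don't care" case from the comment.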