this post was submitted on 14 Feb 2024
LocalLLaMA
Community to discuss about LLaMA, the large language model created by Meta AI.
This is intended to be a replacement for r/LocalLLaMA on Reddit.
It means they want people to consult the code as a reference for how best to use the hardware acceleration.
If all software uses their cards to best effect, that makes their cards more useful and thus more valuable, which earns them money. If only their own frontend can do that, they lose out on most of that value, while also having to spend money to keep the rest of their software, like the UI, competitive.
Ah, that makes sense. Thank you!