catty

joined 2 weeks ago
[–] [email protected] 1 points 11 hours ago

palantir enters the chatroom

[–] [email protected] 16 points 11 hours ago (2 children)

Didn't one of the Epstein girls who called out Trump "commit suicide" earlier in the year?

[–] [email protected] 4 points 11 hours ago

but...but... lemmy is far too small for such companies' paid posters to bother with. it would cost them far too much to type in a few more urls to post to, so it's ok, there's none on lemmy.

but...but... reddit is far too big for such companies to go after; they wouldn't dare. reddit admins/mods would catch it and block the users.

but...but... facebook wouldn't have such content, because meta has so much money they could build, like, a reverse AI that would find it, and they'd be able to automatically detect and remove it.

[–] [email protected] 4 points 1 day ago* (last edited 1 day ago)

AI Market needs to go up and up. BUY, slaves.

[–] [email protected] 1 points 2 days ago (1 children)

But Iran couldn't counterattack before the strike. Soo.... ?

[–] [email protected] 10 points 2 days ago* (last edited 2 days ago)

"nuclear program"? Whatever happened to "nuclear weapons"? Rewriting the narrative already? Ah, we need gold to go back up everyone.

[–] [email protected] 2 points 3 days ago

Sounds like a great first question! Go for it!

 


 


[–] [email protected] 4 points 3 days ago

Thanks, will do all that!

[–] [email protected] 2 points 3 days ago* (last edited 3 days ago) (2 children)

Start now! Install it, get a Python environment up and running if you haven't already, and get that first play-around project working that you can build outwards from!
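If it helps, here's a minimal first play-around sketch, assuming a local ollama server running on its default port (localhost:11434). The helper names and the model name are just examples I picked, not anything official; the endpoint and fields are ollama's documented /api/generate API:

```python
import json
import urllib.request

def build_payload(prompt, model="llama3.2"):
    """Build a non-streaming generate request body (example model name)."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_ollama(prompt, model="llama3.2", host="http://localhost:11434"):
    """POST the prompt to a locally running `ollama serve` and return its text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# ask_ollama("Why is the sky blue?")  # needs `ollama serve` running locally
```

Once that round-trip works you can build outwards: swap models, stream tokens, wire it into a bigger script.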

[–] [email protected] 2 points 3 days ago

Thanks, I'll be checking them out! I see there was even a film made of it. It must have been a big thing. Housos looks like it was made on an absolute shoestring budget, but the editing is spot-on and acting is also on point!

110
submitted 3 days ago* (last edited 3 days ago) by [email protected] to c/[email protected]
 

I've just re-discovered ollama and it's come on a long way: it has reduced the very difficult task of locally hosting your own LLM (and getting it running on a GPU) to simply installing a deb! It also works on Windows and Mac, so it can help everyone.

I'd like to see Lemmy become useful for specific technical sub-branches, instead of people hunting for the best existing community, which is subjective and makes information difficult to find. So I created [email protected] for everyone to discuss, ask questions, and help each other out with ollama!

So please join, subscribe, and feel free to post questions, tips, and projects, and help out where you can!

Thanks!

[–] [email protected] 1 points 4 days ago* (last edited 4 days ago)

It isn't just Bedfordshire; other forces across England are doing the same. Bedfordshire are just the ones owning up to it.

This will be fun when every police officer appears at the top of the list for possibly being a wife abuser. Though I'm sure Palantir will have, uhm, fixed that.

[–] [email protected] 2 points 4 days ago

You purport to be intelligent, so you know exactly what you're doing with your inflammatory responses explicitly "opposing the draft for women".

Oh, and yet more insults in your responses. I sense a theme here :(

 

I'd never heard of this, but I've just come across it, and within two minutes I haven't stopped laughing and I know I want to binge-watch it! Is it as good as it starts?

 

I've tried coding, and every model I've tried fails at anything beyond the really, really basic small functions you write as a newbie, compared to, say, 4o-mini, which can spit out more sensible stuff that actually works.

I've tried explanations and they just regurgitate sentences that can be irrelevant, wrong, or get stuck in a loop.

So, what can I actually use a small LLM for, and which ones? I ask because I have an old laptop whose GPU can't really handle anything above 4B in a timely manner. 8B runs at about 1 t/s!
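For context on why bigger models crawl on old hardware, here's a rough back-of-envelope sketch (my own arithmetic, not anything from ollama): quantized weights alone need roughly params × bits ÷ 8 bytes, before counting the KV cache and activations.

```python
def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate memory needed just for the model weights, in decimal GB."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

print(weight_memory_gb(4, 4))   # 4B at 4-bit quantization -> 2.0 GB
print(weight_memory_gb(8, 4))   # 8B at 4-bit -> 4.0 GB
print(weight_memory_gb(8, 16))  # 8B at fp16 -> 16.0 GB
```

So once a model no longer fits in VRAM, layers spill to system RAM and tokens/sec falls off a cliff, which would explain the 1 t/s at 8B.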
