iByteABit

joined 1 year ago
[–] [email protected] 2 points 7 months ago (3 children)

That's the reason why UBI and raising the minimum wage aren't a permanent solution for the working class though. They're good and certainly provide some relief, but in the end the corporations will exploit even harder, raise prices, causing inflation that wipes out the raises, and then use all of that as a talking point to argue that giving back to workers leads to price increases.

Workers should still fight for raises, benefits and better contracts, but the end goal should be ending the system that gives the owners of capital all the power in the world.

[–] [email protected] 1 points 7 months ago (1 children)

Blazor looks really cool!

[–] [email protected] 2 points 7 months ago

Yeah, the original Super Tic Tac Toe is like that, but I decided to change it a bit because it seemed more intuitive when playing.

There could also be two game modes to choose from before playing, one of them being the original. I'm not planning on that right now, but I'd be very glad to get a PR for it if someone makes that effort.

[–] [email protected] 8 points 7 months ago

I made some major updates: it has an AI mode now, so it's much more enjoyable for someone playing alone.

[–] [email protected] 1 points 7 months ago (1 children)

You mean like Tic Tac Toe within a bigger Tic Tac Toe? That was the original idea by VSauce, but I changed it to the number of games won because it felt more intuitive when playing.

[–] [email protected] 5 points 7 months ago

Yeah I thought so, I hoped the quotations would make it less so.

I still think it's a catchy description though so I'm not changing it yet

 

cross-posted from: https://lemm.ee/post/27328794

A twist on Tic Tac Toe inspired by VSauce, written in React + JS.

I will happily accept contributions; if you're interested, you can check for any open issues or create your own!

 


[–] [email protected] 1 points 1 year ago (1 children)

Who orders beer online wtf

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

Probably American Redditors that can't see past any of the narratives they've been force fed

[–] [email protected] 1 points 1 year ago (2 children)

They aren't the root, but they're a strong enabler

 

@[email protected] I'm crossposting this here just to make sure that you get to see it.

cross-posted from: https://lemmy.dbzer0.com/post/4500908

In the past months, there's been an issue on various instances where accounts would start uploading blatant CSAM to popular communities. First of all, this traumatizes anyone who sees it before the admins get to it, including the admins who have to review it in order to take it down. Second of all, even if the content is a link to an external site, Lemmy still caches the thumbnail and stores it in the local pict-rs, causing headaches for the admins who have to somehow clear that out. Finally, both image posts and problematic thumbnails are federated to other Lemmy instances and likewise stored in their pict-rs, causing such content to end up in their image storage as well.

This has caused multiple instances to take radical measures, from defederating liberally, to stopping image uploads, to even shutting down.

Today I'm happy to announce that I've spent multiple days developing a tool you can plug into your instance to stop this at the source: pictrs-safety

Using a new feature from pict-rs 0.4.3, we can now have pict-rs call an arbitrary endpoint to validate the content of an image before upload. pictrs-safety provides that endpoint, which uses an asynchronous approach to validate such images.
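The validation flow described above can be sketched as a simple accept/reject decision. This is a minimal illustration, not the actual pictrs-safety code: the function name, the threshold, and the `classifier` callable are all hypothetical stand-ins for the real fedi-safety scan.

```python
def validate_upload(image_bytes: bytes, classifier) -> bool:
    """Return True to accept the upload, False to reject it.

    `classifier` is any callable mapping image bytes to a CSAM
    probability in [0, 1] -- a stand-in here for the fedi-safety scan.
    """
    THRESHOLD = 0.5  # assumed cutoff; the real tool chooses its own
    return classifier(image_bytes) < THRESHOLD

# Example with a dummy classifier that flags nothing:
accepted = validate_upload(b"\x89PNG...", lambda _: 0.0)
```

The key design point from the post is that pict-rs only ever sees an accept/reject answer from the endpoint; the heavy scanning work happens behind it.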

I had already developed fedi-safety, which could be used to regularly go through your image storage and delete all potential CSAM. I have now extended fedi-safety to plug into pictrs-safety and scan images sent by pict-rs.

The end effect is that any images uploaded to or federated into your instance will be scanned in advance, and if fedi-safety thinks they're potential CSAM, they will not be uploaded to your image storage at all!

This covers three important vectors for abuse:

  • Malicious users cannot upload CSAM for trolling communities, even novel generative-AI CSAM.
  • Users cannot upload CSAM images and never submit a post or comment (which would make them invisible to admins); such images will be automatically rejected during upload.
  • Federated images and thumbnails of CSAM will be rejected by your pict-rs.

Now, that said, this tool is AI-driven and thus not perfect. There will be false positives, especially around lewd images and images which contain children or child-related topics (even if not lewd). This is the trade-off we have to accept to prevent the bigger problem above.

By my napkin calculations, false positive rates are below 1%, but certainly someone's innocent meme will eventually be affected. If this happens, I ask that you just move on, as we currently don't have a way to whitelist specific images. Don't try to resize or modify the images to pass the filter. It won't help you.

For lemmy admins:

  • pictrs-safety contains a docker-compose sample you can add to your lemmy's docker-compose. You will need to put the .env in the same folder, or adjust the provided variables. (All kudos to @[email protected] for the docker support.)
  • You need to adjust your pict-rs ENVIRONMENT as well. Check the readme.
  • fedi-safety must run on a system with a GPU. The reason is that lemmy provides just a 10-second grace period for each upload before it times out the upload regardless of the results, and a CPU scan will not be fast enough. However, my architecture allows fedi-safety to run in a different place than pictrs-safety; I am currently running it from my desktop. In fact, if you have a lot of images to scan, you can connect multiple scanning workers to pictrs-safety!
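The 10-second grace period mentioned above is why slow scanners are unusable: if no worker answers within the window, there is no verdict in time. Here's a rough sketch of deadline-bounded scanning with a pool of workers; the function names and the None fallback are assumptions for illustration, not the real pictrs-safety design (only the 10-second figure comes from the post).

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError

GRACE_SECONDS = 10  # lemmy's per-upload grace period, per the post

def scan_with_deadline(image: bytes, scan, pool: ThreadPoolExecutor):
    """Run `scan` (e.g. a call to a GPU worker) with a hard deadline.

    Returns the scan verdict, or None if no worker answered in time;
    the caller then decides the fallback behaviour.
    """
    future = pool.submit(scan, image)
    try:
        return future.result(timeout=GRACE_SECONDS)
    except TimeoutError:
        future.cancel()
        return None

# Several scanning workers can drain the queue in parallel:
with ThreadPoolExecutor(max_workers=4) as pool:
    verdict = scan_with_deadline(b"...", lambda img: "ok", pool)
```

Decoupling the endpoint from the workers like this is what lets the GPU machine live somewhere else entirely, such as the author's desktop.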
  • For those who don't have access to a GPU, I am working on an NSFW scanner which will use the AI-Horde directly instead and won't require using fedi-safety at all. Stay tuned.

For other fediverse software admins:

fedi-safety can already be used to scan your image storage for CSAM, so you can also protect yourself and your users, even on Mastodon or Firefish or whatever.

I will try to provide real-time scanning in the future for each software as well and PRs are welcome.

Divisions by zero

This tool is already active on Divisions by Zero. Its usage should be transparent to you, but do let me know if you notice anything wrong.

Support

If you appreciate the priority work that I've put into this tool, please consider supporting this and future development work on Liberapay:

https://liberapay.com/db0/

All my work is and will always be FOSS and available for all who need it most.

[–] [email protected] 1 points 1 year ago

It begins with free speech, and then a few years later trans kids are scared for their lives. Speech affects people and has consequences; it is not something to take lightly.

[–] [email protected] 1 points 1 year ago

Microsoft as usual trying to make a Swiss Army knife where none of the tools work as intended

[–] [email protected] 0 points 1 year ago (5 children)

If the Great Filter theory is correct, climate change will most likely be our Great Filter.

Our species is simply not equipped to deal with the problems it has created. Many people are, but they're not powerful enough to do anything, and there are too many uneducated people for the masses to rise up about this problem.

We think so short-term that it's impossible for some people to think about the future and accept that we'll need to change the way we live now so that we can keep living later. They're hung up on Chernobyl because it was a big bang that killed lots of people at once and was televised everywhere that has a society and TVs. But they're unable to see that, in the long term, coal and gas have killed and are still killing far more people than nuclear accidents, because it's a continuous process that kills people in indirect ways instead of with one big blast.
