will_a113

joined 2 years ago
[–] [email protected] 2 points 1 week ago

All too real.

[–] [email protected] 3 points 2 weeks ago

The ultimate hidden truth of the world is that it is something that we make, and could just as easily make differently.

And have made differently in the past.

Since we're all living in the present, it's extra important to acknowledge the successes (and sometimes catastrophic failures) of past civilizations. The way we're living now is not the only way we've ever lived as a species, but we often seem amazingly incapable of learning from that history.

[–] [email protected] 3 points 2 weeks ago (1 children)

Not that we have any real info about who collects/uses what when you use the API

[–] [email protected] 2 points 2 weeks ago (1 children)

I’m not too sure about varietals of any of the trees. One mango I know is called a lemon meringue mango, and as you might guess it is very citrusy. It’s much smaller and paler than the usual Caribbean mangoes at the supermarket.

Likewise I'm not sure about either avocado. One is what’s colloquially called a Florida avocado. It’s huge - bigger than a softball - with a smooth, bright green skin. The flesh is a bit watery, to the point where I use cheesecloth to wring it out when making guac. It's milder than a Hass as well. The other variety is really interesting. It ripens on the tree until it is dark purple or almost black, like an eggplant. This one is delicious and slightly floral.

I haven’t seen any fruit on either tree again this year, so something is definitely up. An arborist was over a few years ago to do some pruning and didn’t mention anything problematic about either, so it will likely take some research to figure out. I’m not aware of other avocado trees in the neighborhood, but one possibility is certainly that they’ve lost their pollinators.

[–] [email protected] 53 points 2 weeks ago

It’s called “The Tiffany Problem”. You might want to use the historically accurate name Tiffany for a character in your 16th century historical fiction novel, but you can’t because it sounds like someone who was born in 1982.

[–] [email protected] 4 points 2 weeks ago

Nobody knows! There's no specific disclosure that I'm aware of (in the US at least), and even if there was I wouldn't trust any of these guys to tell the truth about it anyway.

As always, don't do anything on the Internet that you wouldn't want the rest of the world to find out about :)

[–] [email protected] 4 points 2 weeks ago (3 children)

They're talking about what is being recorded while the user is using the tools (your prompts, RAG data, etc.)

[–] [email protected] 1 points 2 weeks ago

If money counts as a freedom unit then yes, probably (maybe)

[–] [email protected] 3 points 2 weeks ago

Anthropic and OpenAI both have options that let you use their APIs without your data being used for training (not sure whether the others do as well). So if t3chat is simply calling the API, it may be that t3chat itself is collecting your inputs (or not; you'd have to check the TOS) while its backend model providers are not. Or, who knows, they could all be lying too.
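
To make the plumbing concrete, here's a minimal sketch of what a front-end like that typically does on its backend (hypothetical code and names, not t3chat's actual implementation): your prompt passes through the front-end's own server before it ever reaches the model provider, so a provider-side no-training option only governs the second hop.

```python
import os
import requests

OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def handle_user_message(user_id: str, prompt: str) -> str:
    """Hypothetical chat front-end backend: relays a prompt to a provider API."""
    # Hop 1: the front-end's own server. Nothing stops it from logging the
    # prompt here -- that's governed by the front-end's TOS, not the provider's.
    # app_db.log(user_id, prompt)  # hypothetical logging call

    # Hop 2: the model provider. Whether this request is retained or used for
    # training is controlled by the provider's API terms and account settings,
    # not by anything visible in the request itself.
    resp = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        json={
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]
```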

[–] [email protected] 29 points 2 weeks ago (3 children)

And I can't possibly imagine that Grok actually collects less than ChatGPT.

 

With, I think, a massive grain of salt since this info is unverified and direct from the manufacturer...

Huawei’s official presentation claims their CloudMatrix 384 supercomputer delivers 300 PFLOPS of computing power, 269 TB/s of network bandwidth, and 1,229 TB/s of total memory bandwidth. It also achieves 55 percent model FLOPs utilization (MFU) during training workloads and offers 2.8 Tbps of inter-card bandwidth, heavily emphasizing its strength in networking.

| Spec             | Nvidia NVL72 | Huawei CloudMatrix 384 | Huawei advantage (%) |
|------------------|--------------|------------------------|----------------------|
| Total compute    | 180 PFLOPS   | 300 PFLOPS             | +67%                 |
| Total network bw | 130 TB/s     | 269 TB/s               | +107%                |
| Total memory bw  | 576 TB/s     | 1,229 TB/s             | +113%                |
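
For what it's worth, the percentage column checks out; a quick arithmetic check using the numbers from the table above:

```python
# Percentage advantage of CloudMatrix 384 over NVL72 for each spec in the table
specs = {
    "Total compute (PFLOPS)":  (180, 300),
    "Total network bw (TB/s)": (130, 269),
    "Total memory bw (TB/s)":  (576, 1229),
}

for name, (nvidia, huawei) in specs.items():
    advantage = (huawei / nvidia - 1) * 100
    print(f"{name}: +{advantage:.0f}%")  # +67%, +107%, +113%
```
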
410
submitted 2 weeks ago* (last edited 2 weeks ago) by [email protected] to c/[email protected]
 

A chart titled "What Kind of Data Do AI Chatbots Collect?" lists and compares seven AI chatbots—Gemini, Claude, CoPilot, Deepseek, ChatGPT, Perplexity, and Grok—based on the types and number of data points they collect as of February 2025. The categories of data include: Contact Info, Location, Contacts, User Content, History, Identifiers, Diagnostics, Usage Data, Purchases, Other Data.

  • Gemini: Collects all 10 data types; highest total at 22 data points
  • Claude: Collects 7 types; 13 data points
  • CoPilot: Collects 7 types; 12 data points
  • Deepseek: Collects 6 types; 11 data points
  • ChatGPT: Collects 6 types; 10 data points
  • Perplexity: Collects 6 types; 10 data points
  • Grok: Collects 4 types; 7 data points
[–] [email protected] 12 points 2 weeks ago (1 children)

Gene sequencing wasn’t really a thing (at least an affordable thing) until the 2010s, but once it was widely available archaeologists started using it on pretty much anything they could extract a sample from. Suddenly it became possible to track the migrations of groups over time by tracing genetic similarities, determine how much intermarrying there must have been within groups, etc. Even at individual sites it has been used to determine when leadership was hereditary vs. not, or how wealth was distributed (by looking at residual food DNA on teeth). It really has revolutionized the field and cast a lot of old-school theories (often taken for truth) into the dustbin.

[–] [email protected] 65 points 2 weeks ago (3 children)

That humans came out of Africa once and then settled the rest of the world. In reality there was a constant migration of humans in and out of Africa for millennia while the rest of the world was being populated (and of course it hasn’t ever stopped since).

I love how much DNA analysis has completely upended so much “known” archaeology and anthropology from even just a couple decades ago.

 

I was reading this article about the NYT's suit against OpenAI. OpenAI argued that the NYT couldn't sue for damages because it had been "too long" since the infringing started, and since the NYT "must have known" that OpenAI was doing it, they lost the privilege of collecting damages (IANAL, but I think it's because of the doctrine of laches). In any event, the judge sensibly threw this argument out, telling OpenAI they hadn't demonstrated that the NYT could have known the size, scale, or timing of any alleged infringement.

This made me think: now that the cat is out of the bag and everyone DOES know that everything on the Internet (and beyond) is being fed into AI factories, do we as creators have an obligation to somehow collectively sue LLM makers so that laches can't be used as a defense in the future?

 

While not the gigantic uber-canines of fantasy lore, these pups will become roughly gray-wolf-sized dire wolves and represent the first de-extincted animal species, raising a number of ethical questions about returning animals to ecosystems that may not be stable for long.

 

The foundation will focus on improving ActivityPub and the user experience, informing policymakers, and educating people about the fediverse and how they can participate. They currently have some backing from Meta, Flipboard, Ghost, Mastodon, and others, and the Ford Foundation has also offered the organization a large grant to get the project started. In total, SWF is closing in on $1 million in financial support (or was, as of September)

 

In south Florida at this time of year we see lots of beached Portuguese man-o-wars. They can sting like a jellyfish, but are actually a "colony" of 4 separate polyps that all live together. Often just the bladder remains, but sometimes they'll still have their full array of tentacles, which can reach 10 feet (and will most definitely still sting you if you touch them)

 

A domestic breeding program kept these birds from going extinct. An initial reintroduction to their native habitat on the Big Island was halted after their natural predators proved too adept (or the coddled crows proved not adept enough, I guess). So they're now being relocated to Maui.

 

File this under "small wins". I had been banging my head against a technical problem for most of the day yesterday. As I slipped into bed around midnight, I suddenly knew the solution. Despite the call of the pillows, I dragged myself out of bed, down to the laptop and took a full 30 seconds to write it down -- and thank goodness, because by this morning I had forgotten about it again!

12
submitted 2 months ago* (last edited 2 months ago) by [email protected] to c/[email protected]
 

Crowds and water have more in common than you'd think - they both flow like a fluid, with predictable patterns that can turn perilous if not properly managed. Looks like the physics of human herds is no bull, as researchers have uncovered the fluid dynamics behind dangerous crowd crushes.

 

Using Reddit's popular ChangeMyView community as a source of baseline data, OpenAI had previously found that 2022's ChatGPT-3.5 was significantly less persuasive than random humans, ranking in just the 38th percentile on this measure. But that performance jumped to the 77th percentile with September's release of the o1-mini reasoning model and up to percentiles in the high 80s for the full-fledged o1 model.

So are you smarter than a Redditor?

 

When even Cory Doctorow starts to sound like an optimist, I have to give myself a reality check, as it usually means I'm heading off the deep end. But in this case it just rubs me the wrong way that he talks about Mastodon and Bluesky in the same breath -- one is not like the other.
