this post was submitted on 13 Mar 2024
1 points (100.0% liked)

Hardware

This is a community dedicated to the hardware aspect of technology, from PC parts, to gadgets, to servers, to industrial control equipment, to semiconductors.


Where I live, DRAM-less SSDs are a lot cheaper (half the price). Most sources online say "go for an SSD with DRAM". But I wonder, are there cases in which a DRAM-less SSD will do just fine?

My main focus is resurrecting old laptops (from 2006 to 2015) and installing GNU/Linux, and sometimes investing in an SSD will give them a performance boost, but the budget is limited because I can't sell such an old laptop at anything more than a budget price.

top 15 comments
[–] [email protected] 0 points 8 months ago (1 children)

So, according to most answers, a DRAM-less SSD would do for me. Thanks!

[–] [email protected] 0 points 8 months ago (1 children)

Or do we have a dispute over the issue?

[–] [email protected] 0 points 7 months ago

As long as it doesn't get very full, like 90%+, there will be zero difference in your case.

[–] [email protected] 0 points 8 months ago (1 children)

In most cases, DRAM-less makes little difference for the average user. The biggest difference is for very large transfers, like copying large games between drives. Either way, it's an easy 3-to-5 times performance upgrade compared to an HDD.
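
A rough sketch of how you could see that difference yourself, as a hedged Python example: the test file path is hypothetical, and the file needs to be much larger than your RAM (or caches dropped first), otherwise the OS page cache answers the reads and the drive is never really measured.

```python
import os
import random
import time

# Hypothetical test file; must already exist and should be much larger than
# RAM, otherwise the OS page cache serves the reads instead of the drive.
PATH = "/mnt/data/testfile.bin"
BLOCK = 4096      # 4 KiB random reads, the access pattern HDDs handle worst
SAMPLES = 2000

size = os.path.getsize(PATH)
fd = os.open(PATH, os.O_RDONLY)

latencies = []
for _ in range(SAMPLES):
    # Pick a random, block-aligned offset within the file.
    offset = random.randrange(0, size - BLOCK) // BLOCK * BLOCK
    start = time.perf_counter()
    os.pread(fd, BLOCK, offset)
    latencies.append(time.perf_counter() - start)

os.close(fd)
latencies.sort()
print(f"median: {latencies[len(latencies) // 2] * 1e6:.0f} µs")
print(f"p99:    {latencies[int(len(latencies) * 0.99)] * 1e6:.0f} µs")
```

On a laptop HDD the median typically lands in the millisecond range; on any SATA SSD, DRAM or not, it should be well under a millisecond.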

[–] [email protected] 0 points 8 months ago (1 children)

I don't even understand how a DRAM SSD would be significantly faster outside of benchmarks.

The OS caches everything in PC DRAM and sends it out to the SSD. So adding more RAM to your PC would have the same effect.

In benchmarks, the DRAM SSD appears to be much faster by returning control to the OS much sooner. But a DRAM-less SSD is getting data from the PC's DRAM cache via DMA, and that's not impacting CPU load. So it's not really improving the speed.

[–] [email protected] 0 points 8 months ago (1 children)

It allows the drive to be used more quickly. If you've ever tried using a computer while the disk is at 100% usage, you'll have noticed that anything you do that requires disk access slows to a crawl. With DRAM on the drive, it takes more to overload the drive and makes smaller transfers nearly instant, as data gets buffered into the much faster DRAM rather than directly to the SSD.

Like I mentioned though, in most cases the average user won't notice a difference. If you really want to squeeze a bit of extra performance out of your drive, that's where you'll want the DRAM. If you're just trying to get old laptops running well again, it's basically a non-factor.

[–] [email protected] 0 points 8 months ago* (last edited 8 months ago) (1 children)

It allows the drive to be used more quickly

But no more so than adding the same amount of DRAM to the PC. The path is CPU -> CPU DRAM -> SSD DRAM -> SSD. It will only show a performance difference in benchmarks or if your PC's RAM is completely full. You could get more performance by adding DRAM to the PC and telling the OS to never go below X amount of disk cache.

makes smaller transfers nearly instant, as data gets buffered into the much faster DRAM rather than directly to the SSD.

That's not actual speed but benchmark speed. A copy goes to the PC's cache and then gets written out to the SSD. Having SSD dram allows the SSD to say "done" sooner to the OS despite it taking the same total time.

[–] [email protected] 0 points 8 months ago (1 children)

Having SSD dram allows the SSD to say "done" sooner to the OS despite it taking the same total time.

That's exactly why. When writing to a drive, the OS waits until the disk says "done" and then goes about its business.

If the drive then takes an extra bit of time internally to write to permanent storage, that's none of the OS's business, as long as it can pull that written data from "somewhere" and deliver it to the OS if asked.

[–] [email protected] 0 points 8 months ago* (last edited 8 months ago)

But it gets "done" immediately when you have write caching enabled (which is on by default for non-removable storage) and the file fits in RAM. It's only benchmarks that disable write caching in order to separate PC performance from the drive being tested.
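
A minimal way to watch that from userspace, as a Python sketch (the scratch file path is just an example): write() returning is the page-cache "done", and fsync() is when the data has actually been pushed to the drive.

```python
import os
import time

# Example scratch file on the drive being tested.
PATH = "/tmp/write_cache_demo.bin"
DATA = os.urandom(64 * 1024 * 1024)   # 64 MiB payload

fd = os.open(PATH, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)

# 1) Cached write: write() returns as soon as the data sits in the OS page
#    cache in RAM -- the "done immediately" case with write caching enabled.
t0 = time.perf_counter()
os.write(fd, DATA)
t_cached = time.perf_counter() - t0

# 2) Durable write: fsync() blocks until the drive itself has accepted the
#    data, which is where the drive's own DRAM (or SLC) buffer matters.
t0 = time.perf_counter()
os.fsync(fd)
t_durable = time.perf_counter() - t0
os.close(fd)

print(f"write() returned after {t_cached:.3f}s (OS page cache)")
print(f"fsync() took another  {t_durable:.3f}s (data actually on the drive)")
```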

[–] [email protected] 0 points 8 months ago (1 children)

As was already commented, Host Memory Buffer can to some degree replace the DRAM cache (if the SSD supports it, and even then the implementation can be bad). But the specification is from 2014, so it's unlikely that laptops from up to 2015 will support it.

When there is no DRAM cache on the SSD, the SSD will use the NAND flash cells as cache. This results in more wear and a shorter lifetime. Also, when the SSD gets filled up, it gets significantly slower since there will be fewer free NAND cells to use as cache.
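
If you want to check whether a given NVMe drive even asks for a Host Memory Buffer, nvme-cli's id-ctrl output includes the relevant fields; here's a small Python sketch, assuming the usual human-readable output format and a hypothetical device path.

```python
import re
import subprocess

# Assumed device path; adjust for your system. Needs the nvme-cli package
# and root privileges.
DEVICE = "/dev/nvme0"

# "nvme id-ctrl" dumps the Identify Controller data; hmpre/hmmin are the
# Host Memory Buffer preferred/minimum sizes in 4 KiB units, 0 meaning the
# drive does not request an HMB at all.
out = subprocess.run(
    ["nvme", "id-ctrl", DEVICE],
    capture_output=True, text=True, check=True,
).stdout

fields = dict(re.findall(r"^(hmpre|hmmin)\s*:\s*(\d+)", out, flags=re.MULTILINE))
hmpre = int(fields.get("hmpre", 0))

if hmpre:
    print(f"HMB requested, preferred size: {hmpre * 4 // 1024} MiB")
else:
    print("No Host Memory Buffer requested (or field not reported)")
```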

[–] [email protected] 0 points 8 months ago (1 children)

I think calling it a "cache" is not precise. The primary function of the DRAM is to hold the dictionary for translating logical addresses (e.g. sectors) from the OS to the physical addresses (which NAND chip, which bank etc.). This indirection is needed for the controller to do wear leveling without corrupting the filesystem.

On a SATA SSD without DRAM, each read IO could mean two actual reads: first the dictionary to find the data, and then the actual data being read. As you said, HMB helps by eliminating this extra read.

The read and write caching is just a use of the remaining DRAM capacity. Since modern operating systems use general RAM for the same function, it is usually just a small increase in throughput.
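
A toy model of that dictionary, just to illustrate the indirection; this is a deliberately simplified Python sketch, not how a real flash translation layer works.

```python
# Toy logical-to-physical map, the "dictionary" an SSD controller keeps in
# DRAM. Real FTLs are far more complex; this only shows why the indirection
# lets wear leveling happen without the OS noticing.
class ToyFTL:
    def __init__(self, num_physical_pages):
        self.l2p = {}                                # logical page -> physical page
        self.free = list(range(num_physical_pages))  # unwritten physical pages
        self.flash = {}                              # physical page -> data

    def write(self, logical_page, data):
        # Flash can't be rewritten in place: grab a fresh physical page,
        # write there, then just repoint the mapping entry.
        new_phys = self.free.pop(0)
        self.flash[new_phys] = data
        old_phys = self.l2p.get(logical_page)
        self.l2p[logical_page] = new_phys
        if old_phys is not None:
            del self.flash[old_phys]        # stale copy, to be erased later
            self.free.append(old_phys)

    def read(self, logical_page):
        # Without this map held in DRAM, the lookup itself can cost an extra
        # NAND read before the actual data is read.
        return self.flash[self.l2p[logical_page]]

ftl = ToyFTL(num_physical_pages=8)
ftl.write(0, b"hello")
ftl.write(0, b"hello v2")      # same logical page, new physical page
print(ftl.read(0), ftl.l2p)    # the OS-visible address never changed
```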

[–] [email protected] 0 points 8 months ago

The primary function of the DRAM is to hold the dictionary for translating logical addresses (e.g. sectors) from the OS to the physical addresses (which NAND chip, which bank etc.). This indirection is needed for the controller to do wear leveling without corrupting the filesystem.

That data is still only cached in the DRAM, since DRAM loses its contents when it is no longer powered.

[–] [email protected] 0 points 8 months ago* (last edited 8 months ago)

If you're using a modern NVMe SSD you can simply ignore the presence or lack of a DRAM cache. Modern PCIe devices can use Host Memory Buffer to let the CPU map part of your RAM as the cache, and because with PCIe the CPU is the one accessing the SSD directly anyway, the cost in latency is minimal. The end result is that if you do an extremely heavy I/O benchmark you can indeed measure the difference, but if you're loading programs, saving files, playing games and whatever else, it really doesn't matter.

For SATA SSDs the difference is way more significant, but then again, if you're just restoring old laptops, a DRAM-less SATA SSD will be so much faster at responding to each request compared to those little laptop HDDs that the upgrade will be more than worth it anyway, and spending extra for a DRAM cache might not be worth it for the machine you're dealing with.

[–] [email protected] 0 points 8 months ago (1 children)

If you plan to boot your OS off it, the benefits of a DRAM-less SSD over a traditional hard drive are somewhere between negligible and non-existent in terms of performance. It may give you a bit better battery life, but that is all.

[–] [email protected] 0 points 8 months ago

Lol, that's extremely wrong

The boot time on my laptop is measured in minutes off a hard drive. It's seconds off an SSD.