this post was submitted on 18 Aug 2024
220 points (97.4% liked)

I'm writing a program that wraps around dd to try and warn you if you are doing anything stupid. I have thus been giving the man page a good read. While doing this, I noticed that dd supports size suffixes all the way up to quettabytes, a unit orders of magnitude larger than all the data on the entire internet.

This has made me wonder: what's the largest storage operation you guys have done? I've taken a couple of images of hard drives that were a single terabyte in size, but I was wondering if the sysadmins among you have had to do something with, e.g., a giant RAID 10 array.
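For illustration, here's a minimal sketch of the kind of sanity check I have in mind (this assumes GNU coreutils plus util-linux for lsblk and blockdev; the names and argument handling are placeholders, not the real program):

    #!/usr/bin/env bash
    # Sketch: refuse to dd over a mounted device, and bail out if the
    # source image would not fit on the target.
    set -euo pipefail

    src="$1"   # e.g. disk.img
    dst="$2"   # e.g. /dev/sdX

    # Refuse if the target disk (or any partition on it) is mounted.
    if lsblk -no MOUNTPOINT "$dst" | grep -q .; then
        echo "refusing: $dst is mounted somewhere" >&2
        exit 1
    fi

    src_bytes=$(stat -c %s "$src")
    dst_bytes=$(blockdev --getsize64 "$dst")

    if (( src_bytes > dst_bytes )); then
        echo "refusing: source ($src_bytes B) won't fit on target ($dst_bytes B)" >&2
        exit 1
    fi

    exec dd if="$src" of="$dst" bs=4M status=progress conv=fsync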

[–] [email protected] 1 points 2 months ago

I think it would be my whole broken Manjaro install; I just used dd to make a copy so I could work on it later lol. About 500 gigs.

[–] [email protected] 14 points 2 months ago

a .png of your mom's width

[–] [email protected] 7 points 2 months ago

Approximately 2 petabytes.

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago)

20TB (out of 21TB usable), with a second 6x6TB ZFS raidz2 server as my send target.
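In case anyone wants the shape of it, a full replicated send to the second box looks roughly like this (pool and dataset names are placeholders, not my actual layout):

    zfs snapshot -r tank/data@migrate
    zfs send -R tank/data@migrate | ssh backupbox zfs receive -F backup/data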

[–] [email protected] 11 points 2 months ago

I worked at a niche factory some 20 years ago. We had a tape robot with 8 tapes at some 200GB each. It'd do a full backup of everyone's home directories and mailboxes every week, and incremental backups nightly.

We'd keep the weekly backups on-site in a safe. Once a month I'd do a run to another plant one town over with a full backup.

I guess at most we'd need five tapes. If they still use it, and with modern tapes, it should scale nicely. Today's LTO tapes hold 18TB. Driving five tapes for half an hour would give a nice bandwidth of 50GB/s. The bottleneck would be the write speed to tape, at around 400MB/s.
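For anyone checking the math, in shell arithmetic (using decimal terabytes):

    tb=$((5 * 18))                        # five 18TB tapes in the car
    bytes=$((tb * 1000**4))               # 90 TB total
    secs=$((30 * 60))                     # half-hour drive
    echo "$((bytes / secs / 10**9)) GB/s" # -> 50 GB/s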

[–] [email protected] 5 points 2 months ago

4 TB over my home network. 800GB download from an external server.

[–] [email protected] 7 points 2 months ago

I downloaded that 200GB leak from National Public Data the other day. Maybe not the biggest total, but certainly the largest single text file I've ever messed with.

[–] [email protected] 2 points 2 months ago

I recently copied ~1.6T from my old file server to my new one. I think that may be my largest non-work related transfer.

[–] [email protected] 10 points 2 months ago

Back in the late '90s I worked for an internet search company, long before Google was a thing. We would regularly physically drive a dozen SCSI drives from a RAID array between two datacenters about 20 miles apart.

[–] [email protected] 11 points 2 months ago (2 children)

8 TB but I'm just a regular Joe with a penchant for piracy.

[–] [email protected] 7 points 2 months ago

My Chia crypto farm at its peak had about 1.5 PB of plots; each plot was, I think, about 100ish gigs. I'd plot them on a dedicated machine and then move them to storage for farming. I think I'd move around 10TB per night.

It was done with a combination of PowerShell and bash scripts on Windows, Linux, and the built-in Windows Subsystem for Linux (WSL).
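The nightly move was roughly this sort of loop (paths invented here; the real scripts were spread across several machines):

    # Move finished plots off the plotting box to the farming storage.
    for plot in /mnt/plotting/*.plot; do
        rsync --remove-source-files --progress "$plot" /mnt/farm/
    done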

[–] [email protected] 25 points 2 months ago

It was something around 40 TB ×2. We were doing a terrain analysis of the entire Earth. Every morning for 25 days I would install two fresh drives in the cluster doing the data crunching and migrate the filled drives to our file server rack.

The drives were about 80% full and our primary server was mirrored to two other 50 drive servers. At the end of the month the two servers were then shipped to customer locations.

[–] [email protected] 12 points 2 months ago (1 children)

A few years back I worked at a home. They organised the whole data structure but needed to move to another provider. My colleagues and I moved roughly 15.4 TB. I don't know how long it took, because honestly we didn't have much to do while the data was moving, so we just used the downtime for some nerd time; nerd time in the sense that we started gaming and had a mini LAN party with our Raspberry and Banana Pis.

Surprisingly, the data contained information on lots of long-dead people, which is quite scary because it wasn't being deleted.

[–] [email protected] 4 points 2 months ago

No idea which specific type of business it is, but keeping that history long term can have some benefits, especially to outside people. Some government agencies require companies to keep records for a certain number of years. It could also help in legal investigations many years in the future and show any auditors that you keep good records. From a historical perspective, it can be matched to census records and birth and death certificates. A lot of generational history gets lost otherwise.

Companies also just hoard data. Never know what will be useful later. shrug
