This post was submitted on 21 Jul 2024
208 points (94.4% liked)

Selfhosted


Sorry but I can't think of another word for it right now. This is mostly just venting but also if anyone has a better way to do it I wouldn't hate to hear it.

I'm trying to set up a home server for all of our family photos. We're on our way to de-googling, and part of the impetus for the change is that our Google Drive is almost full. We have a few hundred gigs of photos between us. The problem with trying to download your data from Google is that the only reasonable way to do it is through Google Takeout. First you have to order it. Then you have to wait anywhere from a few hours to a day or two for Google to "prepare" the download. Then you have one week before the takeout "expires," and that's one week to the minute from the time of the initial request.

I don't have some kind of fancy California internet, just normal home internet, and there is no way to download a 50-gig (or 2-gig) file in one go; there are always interruptions that force me to restart the download. But if you try to download the files too many times, Google gives you another error and you have to start over and request a new takeout. Google doesn't let you download the entire archive at once either; you have to select each file part individually.

I can't tell you how many weeks I've spent trying to download all of the files before they either expire or Google throws another error.

top 50 comments
[–] [email protected] 5 points 1 month ago (1 children)

There was an option to split the download into archives of customizable size, IIRC.

[–] [email protected] 1 points 1 month ago (1 children)

Yeah, that introduces an issue of queuing and monitoring dozens of downloads rather than just a few. I had similar results.

Since my family keeps adding photos over the week, I also see no way to verify that previously downloaded parts are identical to the same parts in another takeout, if that makes sense.

[–] [email protected] 2 points 1 month ago* (last edited 1 month ago) (2 children)

You could try a download manager like DownThemAll on Firefox: set up a queue with all the links and a depth of one download at a time.

DtA was a godsend back when I had shitty ADSL. It splits downloads into multiple parts and manages to survive micro-interruptions in the service.

[–] [email protected] 1 points 1 month ago

DownloadThemAll seems to be helping. I'll update the original post with the details once I have success. In this case, I had to first start the download in the browser, then copy the download link and add it to DtA. Someone smarter than me will be able to explain why the extra step was necessary, or how to avoid it.

[–] [email protected] 1 points 1 month ago

I couldn't get it working, but I didn't try too hard. I may give it another shot. I'm trying a different approach right now.

[–] [email protected] 1 points 1 month ago

I do occasional smaller "takeouts" and haven't had any issues.

I have an "automatic album" (or whatever they call it) where all the photos of friends and family (even pets) get automatically added to it. Then I can just request a "takeout" for that one album, since those are the photos I actually care about. It's a much smaller download than the entirety of my Photos account.

[–] [email protected] 15 points 1 month ago (1 children)

Not sure if somebody mentioned it, but you can export to OneDrive. You can get a 1TB account with a free trial, or pay for a single month, and export everything there as plain files, no large zips. Then download it to your computer with the OneDrive app and cancel OneDrive afterwards.

Pretend to be in California/the EU and then request full removal of all your data from both Microsoft and Google.

[–] [email protected] 3 points 1 month ago

This route may be the answer. So far I haven't had any success setting up a download manager that offers a real improvement over the browser. I wanted to avoid my photos being on two corporate services, but as you say, in theory everything is deletable.

[–] [email protected] 16 points 1 month ago (1 children)

Google Takeout is the most GDPR-compliant export platform of all the big tech giants. Amazon, for example, makes you wait until the very last day they legally can.

They also do minimal processing, for example with the metadata (as others have commented), since that is probably how they store it internally and it's what they need to deliver. The simple fact that you can select what you want to request, rather than having to download everything about you, makes it good in my eyes.

I actually see good-faith GDPR compliance in the platform.

[–] [email protected] 1 points 1 month ago

It could absolutely be worse. The main problem is the lack of flexibility: if I could ask for an extension after downloading 80% of the files over a week, for example, that would be helpful. I'm also beginning to suspect that they cap the download speed, because I'm seeing similar speeds on my home and work networks.

[–] [email protected] 3 points 1 month ago (1 children)

A 50GB download takes less than 12 hours on a 10Mbps connection. And I had a 10Mbps link 10 years ago in a third-world country, so maybe check your options with your ISP. 50GB really shouldn't be a problem nowadays.

[–] [email protected] 4 points 1 month ago (2 children)

It's not the speed, it's the interruptions. If I could guarantee an uninterrupted download for 12 hours at a time, then I could do it over the course of 3-4 days. I'm looking into some of the download management tools that people here have suggested.

[–] [email protected] 1 points 1 month ago (1 children)

Get a program that can auto-resume downloads?

[–] [email protected] 2 points 1 month ago (2 children)
[–] [email protected] 2 points 1 month ago

I believe JDownloader is able to.
If I'm not mistaken, wget and curl can resume a download as well, but that may require a small script with error catching that loops automatically until the download finishes.
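For what it's worth, here's a minimal sketch of that kind of wrapper in Python, leaning on wget's built-in resume flag (-c). The URL is a placeholder, and a real Takeout link also needs your logged-in browser session, so treat this as a starting point rather than a drop-in solution.

```python
#!/usr/bin/env python3
"""Retry a download until wget reports success.

wget -c resumes a partially downloaded file instead of starting over,
so each retry only has to fetch whatever is still missing.
"""
import subprocess
import sys
import time

URL = "https://example.com/takeout-001.zip"  # placeholder for a Takeout part link
MAX_ATTEMPTS = 50
RETRY_WAIT = 30  # seconds between attempts

for attempt in range(1, MAX_ATTEMPTS + 1):
    # -c: continue a partial file; the output name is taken from the URL
    result = subprocess.run(["wget", "-c", URL], check=False)
    if result.returncode == 0:  # wget exits 0 only when the download completed
        print("Download finished.")
        sys.exit(0)
    print(f"Attempt {attempt} failed (exit {result.returncode}); retrying in {RETRY_WAIT}s")
    time.sleep(RETRY_WAIT)

sys.exit("Gave up after too many failed attempts.")
```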

[–] [email protected] 3 points 1 month ago (1 children)

I would recommend aria2. It can download several chunks of a file in parallel, resume downloads automatically with a set number of retries, it supports mirrors (maybe not an option for Google Takeout, but useful in other cases), and it can download over many different protocols.
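If you try aria2, a sketch of the relevant flags is below (invoked from Python only for consistency with the script above; the URL and filenames are placeholders). Since Takeout links only work from a logged-in session, you'd also need to export your browser cookies to a Netscape-format cookies.txt for --load-cookies to do anything useful.

```python
import subprocess

url = "https://example.com/takeout-001.zip"  # placeholder Takeout part link
cookies = "cookies.txt"  # cookies exported from a logged-in browser session

subprocess.run([
    "aria2c",
    "--continue=true",          # resume a partially downloaded file
    "--max-tries=0",            # 0 = keep retrying instead of giving up
    "--retry-wait=30",          # wait 30 seconds between retries
    "--split=4",                # fetch up to 4 chunks of the file in parallel
    f"--load-cookies={cookies}",
    "--out=takeout-001.zip",
    url,
], check=True)
```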

[–] [email protected] 2 points 1 month ago (1 children)

Thanks, I'll give it a shot. The download links are a little weird because of the Google authentication, so they can only be used from a logged-in account.

[–] [email protected] 3 points 1 month ago

That might work. I don't know if you live in a remote area, but I'd also consider a coffee shop, library, university, or hotel lobby with wifi. You might be able to download it within an hour.

[–] [email protected] 3 points 1 month ago (1 children)

I'm surprised that feature exists, tbh. It worked fine for my 20GB split into 2GB archives, if I remember correctly.

[–] [email protected] 2 points 1 month ago

I used it for my music collection not that long ago and had no issues. The family photo library is an order of magnitude larger, so it's putting me up against some limitations I didn't hit before.

[–] [email protected] 14 points 1 month ago

Because Google doesn't want you to export your photos. They want you to depend on them 100%.

[–] [email protected] 2 points 1 month ago

It sucked when I closed my accounts years ago. I had to do it manually for the most part.

[–] [email protected] 10 points 1 month ago (3 children)

I just have normal home internet and there is just no way to download a 50gig (or 2 gig) file in one go

"Normal" home internet shouldn't have any problem downloading 50 GB files. I download games larger than this multiple times a week.

[–] [email protected] 3 points 1 month ago* (last edited 1 month ago)

Yeah, of course it varies from place to place, but I think for the majority of at least somewhat developed countries, and for urban areas in less developed countries, 50Mbps is a reasonable figure for "normal home internet". Even at 25Mbps you're looking at about 4½ hours for 50GB, which is very doable if you leave it going while you're at work or just let it run in the background over the course of an evening.

Edit: I was curious and looked it up. Global average download is around 50-60Mbps and upload is 10-12Mbps.
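As a rough sanity check on those figures, the conversion is just bits versus bytes:

```python
# Rough download time: GB -> gigabits (x8) -> megabits (x1000), divided by link speed
size_gb = 50
for mbps in (10, 25, 50):
    hours = size_gb * 8 * 1000 / mbps / 3600
    print(f"{size_gb} GB at {mbps} Mbps ~ {hours:.1f} h")
# 50 GB is roughly 11.1 h at 10 Mbps, 4.4 h at 25 Mbps, 2.2 h at 50 Mbps
```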

[–] [email protected] 0 points 1 month ago (1 children)

Well then read it as "shitty rural internet." Use context clues.

[–] [email protected] 2 points 1 month ago

Which context clues should I be using to blame your "shitty rural internet" on Google?

[–] [email protected] 3 points 1 month ago (1 children)

They must have dialup or live in the middle of nowhere.

[–] [email protected] 3 points 1 month ago (1 children)

That's fair but also not Google's fault.

[–] [email protected] 4 points 1 month ago

The part that is Google's fault is that they limit the number of download attempts and the files expire after one week. That should be clear from the post.

[–] [email protected] 6 points 1 month ago

Try this, then do them one at a time. You have to start the download in your browser first, but you can click "pause" and leave the browser open as it downloads to your server.

[–] [email protected] 3 points 1 month ago

Not really helping you here. But when I started using Google Photos, I still manually downloaded files from my phone to local storage. I did this mainly to ensure I have the original copies of my photos and not some compressed image. Turns out that was a wise move as exporting photos from Google is a pretty damned awful experience.

[–] [email protected] 4 points 1 month ago (1 children)

You could try using rclone's Google Photos backend. It's a command-line tool, sort of like rsync but for cloud storage. https://rclone.org/
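For anyone curious, usage is roughly as sketched below once you've run `rclone config` and created a Google Photos remote. The remote name "gphotos" and the destination path are just placeholders, and note the caveats in the reply below about stripped EXIF location data and non-original resolution.

```python
import subprocess

# "gphotos" is whatever you named the remote during `rclone config`;
# media/by-year is one of the virtual paths the Google Photos backend exposes.
subprocess.run(
    ["rclone", "copy", "--progress", "gphotos:media/by-year/2024", "/srv/photos/2024"],
    check=True,
)
```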

[–] [email protected] 10 points 1 month ago (2 children)

Looked promising until

When Images are downloaded this strips EXIF location (according to the docs and my tests). This is a limitation of the Google Photos API and is covered by bug #112096115.

The current google API does not allow photos to be downloaded at original resolution. This is very important if you are, for example, relying on "Google Photos" as a backup of your photos. You will not be able to use rclone to redownload original images. You could use 'google takeout' to recover the original photos as a last resort

[–] [email protected] 3 points 1 month ago

Oh dang, sorry about that. I've used rclone with great results (slurping content out of Dropbox, Google Drive, etc.), but I never actually tried the Google Photos backend.

[–] [email protected] 6 points 1 month ago

The word you're looking for is "petty."

[–] [email protected] 5 points 1 month ago

It's called: vendor lock-in.

[–] [email protected] 4 points 1 month ago

Use Drive, or, if it's more than 15GB (or whatever the free max is these days), pay for storage for one month for a couple of dollars on one of the supported platforms and download from there.
