
Creating custom Google Photos API credentials for Cloud Sync Task

NightFish

Cadet
Joined
Jun 4, 2022
Messages
7
NightFish submitted a new resource:

Creating custom Google Photos API credentials for Cloud Sync Task - Workaround fix for a Google Photos Cloud Sync Task that won't start or stays at 0%

The Issue:
Some people have encountered an issue where Google Photos Cloud Sync Tasks are stuck at 0% and will not start. If you look at the rclone logs, they show "0% done, 0 Bytes/s, ETA: -".

The issue has something to do with the API credentials that are generated when you use the "Log In To Provider" button:


The Fix:
The workaround for this issue is to create your own Google Photos API credentials and use those instead.
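
As a rough sanity check (not part of the guide itself), once the custom credentials are wired into an rclone remote - here assumed to be named "gphotos" - a listing like the one below should return the year folders instead of hanging:

Code:
#!/bin/sh
# Minimal check that the custom credentials actually work. The remote name
# "gphotos" is an assumption; adjust it to whatever you called yours.
rclone lsd gphotos:media/by-year -vv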

Hint: If you decide...

Read more about this resource...
 

kalosbg

Cadet
Joined
Sep 28, 2022
Messages
1
Hello,
Which provider should be selected from the drop-down?
I only have Google Drive and Google Cloud Storage, and I've tried both of them without success.

Regards,
Kalin
 

Sketch89

Cadet
Joined
Jan 9, 2023
Messages
2
I followed the guide, and at the end it said the credentials are valid. The backup ran once and got to 89%. Now it won't start again and stays at 0%. Anyone else experiencing this problem?
 

jbarranco

Dabbler
Joined
Sep 7, 2022
Messages
11
I had a similar experience to Sketch89 - I set up the credentials (client ID, secret, and then the access token) and added them to the TrueNAS provider, but the sync process showed the same message as before. No progress, no transfers, etc.

I should mention that I used option 19 in the rclone config, which was the "Google Photos" option.

So I'm in the same boat and am just going with the stock TrueNAS authentication.
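
For reference, the same kind of remote can also be created non-interactively instead of through the numbered menu; a rough sketch, assuming the remote is named "gphotos" and you already have the client ID and secret (all values below are placeholders):

Code:
#!/bin/sh
# Non-interactive version of creating the "google photos" remote.
rclone config create gphotos "google photos" \
    client_id "YOUR_CLIENT_ID.apps.googleusercontent.com" \
    client_secret "YOUR_CLIENT_SECRET" \
    read_only true
# The OAuth token still has to come from a machine with a browser, e.g.:
#   rclone authorize "google photos" "YOUR_CLIENT_ID..." "YOUR_CLIENT_SECRET"
# and can then be pasted into the remote's token field.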
 

rPal

Cadet
Joined
Apr 23, 2019
Messages
3
In step 20 I can't copy the link (I'm accessing TrueNAS from Windows 10 with Firefox). The link breaks in the middle of the client_id, so I can't copy the text.

Any ideas?
 

Sketch89

Cadet
Joined
Jan 9, 2023
Messages
2
Currently the sync fails for me with the following error: "Quota exceeded for quota metric 'All requests' and limit 'All requests per day' of service 'photoslibrary.googleapis.com'". Has anyone else experienced this?
 

brahmy

Dabbler
Joined
Mar 24, 2022
Messages
13
Sketch89 said:
"Quota exceeded for quota metric 'All requests' and limit 'All requests per day' of service 'photoslibrary.googleapis.com'" - anyone else experienced this?
Yes, I have seen this...

Suggestions:
  • Ensure you are NOT syncing the /by-day folder. It apparently generates a ton of API requests, and you're limited to 10k per day, so you will quickly blow through that limit. (Photos will eventually sync if you leave the task running, since Google resets the API counter every 24h.)
    • Instead, I am syncing /media/by-year/2023, and I just have a manual calendar reminder to update this task every year (a manual rclone sketch of this follows the list).
  • If you're syncing ALL your photos for the first time and you have a lot, it might take a couple of days / quota windows to sync them.
  • Ensure your API app is in Production, not Testing mode, or else your credentials will expire after a week. This drove me absolutely batty.
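
Roughly, the manual rclone equivalent of that by-year pull looks like this (the remote name "gphotos" and the destination path are placeholders, not my actual config):

Code:
#!/bin/sh
# Rough manual equivalent of a by-year-only pull; adjust remote and path.
YEAR=$(date +%Y)
rclone copy "gphotos:media/by-year/${YEAR}" "/mnt/tank/photos/${YEAR}" --transfers 4 -v
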
Using the by-year folder sync, I just synced a month or two of photos, and this is what my API limits look like:

(screenshot: API request usage with the by-year sync)


In contrast, I have an image on this ticket of hitting 10k API requests per day when trying to sync the by-day folder for my entire Google Photos library history.

Hope this helps

EDIT: Another tip I'd like to share is using healthchecks.io's free cron job monitoring service to keep an eye on this finicky backup system. It's easy for it to fail without providing an alert. Sign up at https://healthchecks.io/, create a check, and add the following post-script to the cloud sync job:

Code:
#!/bin/sh
curl -m 10 --retry 5 https://hc-ping.com/PING_URL_GOES_HERE
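
If you want to see whether the ping itself went through, a slightly more defensive variant of the same post-script is sketched below (same placeholder URL; appending /fail to the hc-ping URL is healthchecks.io's standard way to record an explicit failure, if you ever want that):

Code:
#!/bin/sh
# Same ping as above, but report loudly if hc-ping.com is unreachable.
URL="https://hc-ping.com/PING_URL_GOES_HERE"
if curl -fsS -m 10 --retry 5 -o /dev/null "$URL"; then
    echo "healthchecks ping sent"
else
    echo "healthchecks ping FAILED" >&2
    exit 1
fi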
 

cyberchris

Cadet
Joined
Feb 27, 2023
Messages
1
After "Verify Credentials" (step 26) it shows me "failed to configure Box: invalid character '/' after top-level value", and when I click More: "Failed to create file system for "remote:": failed to configure Box: invalid character '/' after top-level value".

What can I do now?
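
For what it's worth, that particular message is the Go JSON parser complaining that the value it was given is not a single JSON object, which in this context usually points at whatever was pasted into the OAuth token field. A quick sanity check you could run from a shell (the token shown is a placeholder using rclone's usual field names; python3 being available is an assumption):

Code:
#!/bin/sh
# Paste your real token between the single quotes; it must be exactly one
# JSON object, with no stray characters before or after it.
TOKEN='{"access_token":"ya29.PLACEHOLDER","token_type":"Bearer","refresh_token":"1//PLACEHOLDER","expiry":"2024-01-01T00:00:00.000000Z"}'

printf '%s' "$TOKEN" | python3 -m json.tool >/dev/null 2>&1 \
    && echo "token parses as JSON" \
    || echo "token is NOT valid JSON"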
 

nikkon

Contributor
Joined
Dec 16, 2012
Messages
163
I don't know if this worked for you; for me it did not.
 

robiwan

Cadet
Joined
Oct 19, 2018
Messages
1
Does anyone know why I don't have the Google Photos option in the dropdown box? I only see Google Drive and Google Cloud Storage.
 

nikkon

Contributor
Joined
Dec 16, 2012
Messages
163
It has been removed in the latest updates.
 

magickarle

Dabbler
Joined
May 5, 2023
Messages
16
brahmy said:
(quoted suggestions above about skipping /by-day, syncing /media/by-year, Production mode, and the healthchecks.io post-script)
How are you forcing the /media/by-year/ path?
I've tried to enter it manually in the Cloud Sync Task:
/,/media,/media/by-year/
/media/by-year/

But as soon as I save it, it gets replaced by mumbo jumbo, or I get "All selected directories must be at the same level i.e., must have the same parent directory".
A dry run works on /media/by-year/ (I see pics being skipped because they already exist), but on save it gets replaced by /media/by-year//,/media/by-year//media
 

magickarle

Dabbler
Joined
May 5, 2023
Messages
16
In the CLI:

Code:
[truenas] task cloud_sync> query
+----+---------------------------+-----------+--------------------------------------------------------+------------+---------+---------------+------------+---------------------+---------------------+-----------------+------+-------------+------------+----------+--------------+-----------+--------------+-----------+-----------------------+-----------------+--------+-------------+----------+--------+
| id | description | direction | path | attributes | enabled | transfer_mode | encryption | filename_encryption | encryption_password | encryption_salt | args | post_script | pre_script | snapshot | bwlimit | include | exclude | transfers | create_empty_src_dirs | follow_symlinks | job | credentials | schedule | locked |
+----+---------------------------+-----------+--------------------------------------------------------+------------+---------+---------------+------------+---------------------+---------------------+-----------------+------+-------------+------------+----------+--------------+-----------+--------------+-----------+-----------------------+-----------------+--------+-------------+----------+--------+
| 1 | gphoto_truenas_magickarle | PULL | /mnt/MainPool/pictures/Google Photos Backup/magickarle | <dict> | true | COPY | false | false | | | | | | false | <empty list> | //** | <empty list> | <null> | false | false | <dict> | <dict> | <dict> | false |
| | | | | | | | | | | | | | | | | /media/** | | | | | | | | |
+----+---------------------------+-----------+--------------------------------------------------------+------------+---------+---------------+------------+---------------------+---------------------+-----------------+------+-------------+------------+----------+--------------+-----------+--------------+-----------+-----------------------+-----------------+--------+-------------+----------+--------+

I can see the value /media/** in the include column, but once I get into update 1, I can't.
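
One thing that might be worth trying is setting the include list directly through the middleware instead of the form; a sketch, assuming task id 1 as shown above and that your release's cloudsync API accepts a partial update and a dry_run option (worth double-checking):

Code:
#!/bin/sh
# Set the include filter on cloud sync task 1 straight through the middleware.
midclt call cloudsync.update 1 '{"include": ["/media/by-year/**"]}'

# Kick off a dry run of the same task to confirm the filter behaves as expected.
midclt call cloudsync.sync 1 '{"dry_run": true}'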
 

drewzoo02

Cadet
Joined
Dec 27, 2022
Messages
1
brahmy said:
(quoted suggestions above about skipping /by-day, syncing /media/by-year, Production mode, and the healthchecks.io post-script)
At what step of the guide do you turn on Production mode? Do you turn it on after creating the credentials, or before? Sorry to bother you.
 

hedchange

Cadet
Joined
Mar 10, 2023
Messages
4
drewzoo02 said:
(asking at what step to turn on Production mode)
I turned mine on after I did the rclone setup and set up the credentials in TrueNAS, confirmed that they were valid, and ran a test. I set mine up originally about 8 days ago and forgot to put it back into Production because I had to re-set up a few things and test the user. Even after switching to Production it would not sync, so I remade the credentials and edited my rclone config with the new client ID, secret, and token on the backup sync. Good luck.
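
As an aside, rather than hand-editing the token, rclone can re-run the OAuth flow for an existing remote and print the refreshed config; a sketch assuming the remote is named "gphotos" and a browser is reachable from wherever rclone runs:

Code:
#!/bin/sh
# Re-run the OAuth authorization for the existing remote, then show its config
# so the new token can be copied into the TrueNAS credential.
rclone config reconnect gphotos:
rclone config show gphotos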
 

hedchange

Cadet
Joined
Mar 10, 2023
Messages
4
magickarle said:
(quoted question above about forcing the /media/by-year/ path in the Cloud Sync Task)
Did you figure this out? I am running out of 'All requests' quota and am trying not to sync by-day. I am getting the same thing you are, but I cannot even complete a dry run, let alone save it. Thanks.
 

hedchange

Cadet
Joined
Mar 10, 2023
Messages
4
magickarle said:
(quoted question above about forcing the /media/by-year/ path in the Cloud Sync Task)
I have put some folders in the Exclude section.

(screenshot: Exclude field settings)


I have not gotten a chance to run it yet, as I am out of requests. I did, however, select all the folders, including the root, for the photos.
I will follow up and see what it does in the morning when it reruns.
(screenshot: folder selection for the task)
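
For reference, the Exclude field takes rclone filter patterns, so a manual dry run along these lines can show whether the duplicate views actually get skipped (the remote name and destination are placeholders, and the exact patterns depend on where your task is rooted):

Code:
#!/bin/sh
# Manual dry run with the duplicate views excluded; patterns assume the task
# is rooted at the remote's media folder.
rclone copy gphotos:media /mnt/tank/photos --dry-run -v \
    --exclude "by-day/**" \
    --exclude "by-month/**"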
 

viniciuscm

Cadet
Joined
Aug 3, 2023
Messages
1
hedchange said:
(quoted reply above about adding folders to the Exclude section)
I tried adding to the Exclude section, but it does not work... it keeps syncing by-day, by-month, and by-year. This is bad because you basically get three copies of the same thing consuming space.

Another bummer I found yesterday is that this method compresses photos and videos; looking at the rclone page, this is expected:
(screenshot: rclone documentation note about downloaded media being compressed)


Are there any alternatives that preserve the original quality and can be automated on TrueNAS?
 

sstruke

Dabbler
Joined
Feb 2, 2017
Messages
37
brahmy said:
(quoted suggestions above about skipping /by-day, syncing /media/by-year, Production mode, and the healthchecks.io post-script)
I need help. When I move the API from Testing to Production, it tells me that I don't have access when logging in. ("Access blocked: The gogle1234 application request is invalid")
 