Locking local path '/mnt/tank' for reading

robinmorgan

Dabbler
Joined
Jan 8, 2020
Messages
36
Hello all!

Running a Cloud Sync Task for S3 seems to go nowhere. I have tried creating new datasets with clean permissions.

Any suggestions?
 

Attachments

  • Screenshot 2020-07-16 at 16.12.09.png (62.9 KB)

Samuel Tai

Never underestimate your own stupidity
Moderator
Joined
Apr 24, 2020
Messages
5,399
How deeply are these datasets nested? I don't experience these locks during S3 sync with a dataset at the root level. Remember, you'll need to look at the permissions not just at the level you want to replicate, but in all parent datasets as well.
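One way to audit the whole chain is to walk up from the dataset and print each directory's mode. This is an illustrative sketch, not part of the FreeNAS middleware; `walk_parent_perms` is a hypothetical helper, and the sync user needs at least traverse (execute) permission on every parent shown:

```python
import os
import stat

def walk_parent_perms(path):
    """Yield (directory, octal mode) for path and each of its parents.

    A restrictive mode anywhere up the tree can block a job that
    needs to traverse into a nested dataset.
    """
    path = os.path.abspath(path)
    while True:
        mode = stat.S_IMODE(os.stat(path).st_mode)
        yield path, oct(mode)
        parent = os.path.dirname(path)
        if parent == path:  # reached the filesystem root
            break
        path = parent

# Example: inspect the chain for the current directory
# (substitute your dataset path, e.g. /mnt/tank/...)
for directory, mode in walk_parent_perms(os.getcwd()):
    print(directory, mode)
```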
 

robinmorgan

Dabbler
Joined
Jan 8, 2020
Messages
36
Great question, thank you. They are three layers deep. I have created a new test dataset and I have the same issue:
'/mnt/tank1/test2'
 

Samuel Tai

Never underestimate your own stupidity
Moderator
Joined
Apr 24, 2020
Messages
5,399
How do you have your cloud sync defined? Are you using the new v2 AWS signatures in your cloud credentials?
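For reference, the temporary config the middleware hands to rclone for an S3 remote looks roughly like this. This is a sketch based on rclone's s3 backend options, not the exact file the middleware writes; all values are placeholders, and `v2_auth` is the rclone option that controls the v2 signature scheme:

```ini
[remote]
type = s3
provider = AWS
access_key_id = AKIAXXXXXXXXXXXXXXXX
secret_access_key = <secret>
region = us-east-1
# enable only if the endpoint requires the older v2 signatures
v2_auth = false
```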
 

Samuel Tai

Never underestimate your own stupidity
Moderator
Joined
Apr 24, 2020
Messages
5,399
Under System->Cloud Credentials, did you define things like this?

[Screenshot: Cloud Credentials form]


Did you verify your credential by clicking the button?
 

Samuel Tai

Never underestimate your own stupidity
Moderator
Joined
Apr 24, 2020
Messages
5,399
OK, then how do you have the cloud sync task defined?
 

robinmorgan

Dabbler
Joined
Jan 8, 2020
Messages
36
Interesting. The job ran today; the only difference is that my B2 task had finished. Can you run two tasks at once?
 

Samuel Tai

Never underestimate your own stupidity
Moderator
Joined
Apr 24, 2020
Messages
5,399
In principle, it should be possible to run two cloud syncs at once, but looking at /usr/local/lib/python3.7/site-packages/middlewared/plugins/cloud_sync.py, which implements the job, I see some implementation details that may prevent more than one sync from running:
  • Line 136 starts the logic that generates a temporary config file for the rclone job. I don't know if rclone can deal with multiple configs in memory.
  • Line 1001 generates the "Locking local path" message during the job; if another job is already running in the same path, that job's lock will prevent this job from acquiring the lock until the other job completes.
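The effect of that per-path lock can be sketched with a toy model. This is illustrative only, not the actual middleware code: jobs targeting the same local path serialize on a shared lock, while jobs on different paths would run concurrently.

```python
import threading
import time

# One lock per local path, as in "Locking local path '/mnt/...'".
_path_locks = {}
_registry_lock = threading.Lock()

def _lock_for(path):
    with _registry_lock:
        return _path_locks.setdefault(path, threading.Lock())

events = []  # ordered record of what each job did

def run_sync(name, path, duration):
    lock = _lock_for(path)
    events.append(f"{name}: waiting for lock on {path}")
    with lock:  # the "Locking local path" step happens here
        events.append(f"{name}: locked {path}")
        time.sleep(duration)  # stand-in for the rclone transfer
        events.append(f"{name}: released {path}")

# Two jobs on the same dataset: the second waits for the first.
t1 = threading.Thread(target=run_sync, args=("S3 push", "/mnt/tank1/test2", 0.2))
t2 = threading.Thread(target=run_sync, args=("B2 push", "/mnt/tank1/test2", 0.2))
t1.start()
time.sleep(0.05)  # let the first job take the lock
t2.start()
t1.join()
t2.join()
```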
 

robinmorgan

Dabbler
Joined
Jan 8, 2020
Messages
36
Great work, thank you for your help. The B2 push and the AWS push were on separate datasets. I'm moving everything to AWS deep, so this shouldn't be an issue in the future! Thank you again!
 

Samuel Tai

Never underestimate your own stupidity
Moderator
Joined
Apr 24, 2020
Messages
5,399
You're welcome. It's also possible that you may not have verified the credential at the beginning, and the sync started working once the credential was verified.
 