SOLVED Moving Files from the command line

ianwood

Dabbler
Joined
Sep 27, 2021
Messages
14
Read through a bunch of threads on this, but it doesn't seem totally clear what the best practice is. I work with the Cinema DNG video format, which consists of thousands of RAW files, so I move a lot of small files around. I have scripts that I use to make moving footage around easier. For example:

Code:
mv H238_C001_20190702_R1/H238_C001_20190702_R{01766..02023}.dng source/kyoto-gardens-02


This moves a segment of footage from the original folder to a source clip folder that has already been created. This worked very well before moving to the NAS. If I do the above from a local terminal via the SMB share, it is slow and inefficient, but it works. If I use SSH and issue the command on the TrueNAS server, it is hit or miss, and I cannot make heads or tails of it.

Sometimes it works fine, and the only issue is that the SMB share doesn't show the moved files right away.

Other times, and I cannot figure out why, it doesn't execute at all and returns this:

Code:
mv: rename H238_C001_20190702_R1/H238_C001_20190702_R{01766..02023}.dng to source/kyoto-gardens-02/H238_C001_20190702_R{01766..02023}.dng: No such file or directory


Just to test permissions and folders, if I remove the range expression and issue this command to move just one file, it works fine:

Code:
mv H238_C001_20190702_R1/H238_C001_20190702_R01766.dng source/kyoto-gardens-02


I am a bit stumped. Anyone have any ideas? Why would it work sometimes? Ideally, I would like to move the files via SMB and avoid using SSH, but it seems TrueNAS SMB gets bogged down when there are lots of little files (e.g., deleting lots of little files over SMB to TrueNAS takes a loooooong time to complete).
 

Patrick M. Hausen

Hall of Famer
Joined
Nov 25, 2013
Messages
7,776
H238_C001_20190702_R{01766..02023}.dng
I have never seen any filename wildcard pattern like this. What operating system and what shell is this?

If you use shell commands in an interactive session via ssh, you are bound to standard Unix "glob" patterns such as *, ?, and [...].
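
A quick illustration of standard glob syntax, using the filenames from the post above (just a sketch; the shell expands these against files that already exist):

Code:
ls H238_C001_20190702_R1/*.dng                          # * matches any run of characters
ls H238_C001_20190702_R1/H238_*_R0176?.dng              # ? matches exactly one character
ls H238_C001_20190702_R1/H238_*_R01[78][0-9][0-9].dng   # [...] matches one character from a set
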
 

Samuel Tai

Never underestimate your own stupidity
Moderator
Joined
Apr 24, 2020
Messages
5,399
Root by default runs CSH, so you have to use CSH glob patterns to refer to multiple files. CSH doesn't have the concept of using .. to refer to ranges between {}, so the whole {01766..02023} word gets handed to mv literally, which is why the braces show up in your error message. Something like this would work: mv H238_C001_20190702_R1/H238_C001_20190702_R*.dng source/kyoto-gardens-02.
 

ianwood

Dabbler
Joined
Sep 27, 2021
Messages
14
It works really well in bash/zsh on OSX. And I *thought* I had it working over SSH into TrueNAS, but maybe I imagined that.

If I have to use globs, I am limited to wildcards per digit, so I guess I have to iterate through the numerical range to get a similar result. That's no fun.
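
Something like this rough sh loop would probably do it (same example range as above; it assumes seq is available on the NAS, and it won't run under csh, so it would have to go through sh, bash, or zsh):

Code:
#!/bin/sh
# Move one zero-padded frame range, one file at a time.
src=H238_C001_20190702_R1
dst=source/kyoto-gardens-02
for n in $(seq 1766 2023); do
    mv "$src/$(printf 'H238_C001_20190702_R%05d.dng' "$n")" "$dst/"
done
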

Alternatively, maybe future versions of TrueNAS will get faster at moving/deleting lots of small files over SMB.
 

ianwood

Dabbler
Joined
Sep 27, 2021
Messages
14
Root by default runs CSH, so you have to use CSH glob patterns to refer to multiple files. CSH doesn't have the concept of using .. to refer to ranges between {}. Something like this would work: mv H238_C001_20190702_R1/H238_C001_20190702_R*.dng source/kyoto-gardens-02.

The issue is that I have 25,000 files in one folder, and I use the mv command to extract specific portions of it at a time. Below is a complete set of commands generated by putting an EDL file through a script, which translates the timecode into the corresponding frames (files) of source footage; a possible workaround is sketched after the list.

Code:
mv H238_C001_20190702_R1/H238_C001_20190702_R{00209..00586}.dng source/kyoto-gardens-01
mv H238_C001_20190702_R1/H238_C001_20190702_R{01766..02023}.dng source/kyoto-gardens-02
mv H238_C001_20190702_R1/H238_C001_20190702_R{02873..03401}.dng source/kyoto-gardens-03
mv H238_C001_20190702_R1/H238_C001_20190702_R{05687..05978}.dng source/kyoto-gardens-04
mv H238_C001_20190702_R1/H238_C001_20190702_R{11646..11978}.dng source/kyoto-gardens-05
mv H238_C001_20190702_R1/H238_C001_20190702_R{12545..13023}.dng source/kyoto-gardens-06
mv H238_C001_20190702_R1/H238_C001_20190702_R{14706..15239}.dng source/kyoto-gardens-07
mv H238_C001_20190702_R1/H238_C001_20190702_R{18190..18561}.dng source/kyoto-gardens-08
mv H238_C001_20190702_R1/H238_C001_20190702_R{19931..20290}.dng source/kyoto-gardens-09
mv H238_C001_20190702_R1/H238_C001_20190702_R{22230..22786}.dng source/kyoto-gardens-10
mv H238_C001_20190702_R1/H238_C001_20190702_R{23359..23810}.dng source/kyoto-gardens-11
mv H238_C001_20190702_R1/H238_C001_20190702_R{24506..25032}.dng source/kyoto-gardens-12
mv H238_C001_20190702_R1/H238_C001_20190702_R{25530..25884}.dng source/kyoto-gardens-13
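
Since the whole batch is generated anyway, maybe the path of least resistance is to write the generated commands to a file and run that file with a shell that understands the {a..b} ranges, if one is installed on the NAS. A sketch, assuming zsh is available and the commands above are saved as moves.sh (the filename is just an example):

Code:
# moves.sh contains the generated mv commands above, unchanged
zsh moves.sh
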
 

Samuel Tai

Never underestimate your own stupidity
Moderator
Joined
Apr 24, 2020
Messages
5,399
Alternatively, try launching /bin/zsh before running your mv, and then exiting out once you're done. You could also try changing root's shell to zsh in the GUI.
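
For example (a sketch using the range from earlier in the thread, and the /bin/zsh path mentioned above):

Code:
/bin/zsh
mv H238_C001_20190702_R1/H238_C001_20190702_R{01766..02023}.dng source/kyoto-gardens-02
exit

# or as a one-shot, without starting an interactive shell:
/bin/zsh -c 'mv H238_C001_20190702_R1/H238_C001_20190702_R{01766..02023}.dng source/kyoto-gardens-02'
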
 

ianwood

Dabbler
Joined
Sep 27, 2021
Messages
14
Alternatively, try launching /bin/zsh before running your mv, and then exiting out once you're done. You could also try changing root's shell to zsh in the GUI.
Thanks. I will try that.
 