SOLVED: automatically delete files older than x days

iliak

Contributor
Joined
Dec 18, 2018
Messages
148
Hi, I am trying to make a cron job to delete old files from a specific folder.
I have tried and validated that this line works in Ubuntu, but it does not work on FreeNAS.
The folder sometimes contains a lot of files (over 1M, most of them very small); a plain rm sometimes fails due to command-line length limits, and this command works much faster.

Code:
perl -e 'unlink grep { -f and -M >= 7 } glob "/mnt/junk/*/20*"'
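
The cron entry is something like this (the exact schedule doesn't matter, it's just an example):

Code:
# example /etc/crontab entry, runs daily at 03:00
0 3 * * * root perl -e 'unlink grep { -f and -M >= 7 } glob "/mnt/junk/*/20*"'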


Any idea what I did wrong?
 

Meyers

Patron
Joined
Nov 16, 2016
Messages
211
Hrmm it could be differences in Perl versions. I would probably just do something like this instead:

Code:
# First test the command like this to verify that it's selecting the right files:
find /mnt/junk/*/20* -type f -mtime +7d -print0 | xargs -0 ls -la

# This will remove them:
find /mnt/junk/*/20* -type f -mtime +7d -print0 | xargs -0 rm -f


This is typically the one-liner that gets recommended for deleting a lot of files.

I've tested this minimally so obviously you'll want to test with the first command before running the actual removal.
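
If the pipeline turns out to be awkward to put in cron, find's built-in -delete action should do the same thing in a single step. I haven't tried it on FreeNAS specifically, so run it with -print instead of -delete first to see what it would match:

Code:
# untested sketch: let find remove the matches itself, no xargs needed
# replace -delete with -print to preview the files it selects
find /mnt/junk/*/20* -type f -mtime +7d -delete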
 

iliak

Contributor
Joined
Dec 18, 2018
Messages
148
It's only partially working; not all files and folders are deleted.

Running it from the command line works, but running it from cron as root does not work for all files.
 

jgreco

Resident Grinch
Joined
May 29, 2011
Messages
18,680
It's only partially working; not all files and folders are deleted.

Running it from the command line works, but running it from cron as root does not work for all files.

Your original post specifically said files. @Meyers provided a shell recipe that specifically deletes files.

I find it somewhat painful and tedious to embed complex commands into crontab, as the debugging environment is less than ideal. You might try creating a script that executes from your pool.

Code:
#! /bin/sh -

(
echo pruner run beginning at `date` 1>&2
# First test the command like this to verify that it's selecting the right files: 
find /mnt/junk/*/20* -type f -mtime +7d -print0 | xargs -0 -t ls -la 

# This will remove them: 
# find /mnt/junk/*/20* -type f -mtime +7d -print0 | xargs -0 -t rm -f
echo pruner run ending at `date` 1>&2
) 2>> /mnt/junk/pruner.err >> /mnt/junk/pruner.out


Save it as /mnt/junk/pruner or something like that.

This is basically @Meyers' suggestion, slightly embellished in a script. This has the added benefit that it can easily be called from the command line, and that it will report startup, termination, and the actions taken (the "-t" in xargs) in the log files. Most of your output should end up in pruner.err.
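
To schedule it, make the script executable and point a cron entry (or a Cron Job task in the FreeNAS GUI) at it; the time below is just an example:

Code:
chmod +x /mnt/junk/pruner

# example root crontab line: run the pruner once a day at 02:00
0 2 * * * /mnt/junk/pruner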
 

iliak

Contributor
Joined
Dec 18, 2018
Messages
148
I implemented your method; it is much cleaner, and I found some issues with folders along the way. Now it is working perfectly.
Thanks
 

fixit9660

Dabbler
Joined
Jan 14, 2018
Messages
33
Code:
#! /bin/sh -

(
echo pruner run beginning at `date` 1>&2
# First test the command like this to verify that it's selecting the right files:
find /mnt/junk/*/20* -type f -mtime +7d -print0 | xargs -0 -t ls -la

# This will remove them:
# find /mnt/junk/*/20* -type f -mtime +7d -print0 | xargs -0 -t rm -f
echo pruner run ending at `date` 1>&2
) 2>> /mnt/junk/pruner.err >> /mnt/junk/pruner.out
Thank you for this. I found it works almost perfectly for my needs (I just substituted the relevant folders), except that it's also picking up system files in the root of the subfolders whose folders and files I want to process:

Code:
drwxrwxr-x+ 60 GoatCam1 CCTV_Clients 68 Mar 1 00:05 .
drwxrwxr-x+ 3 GoatCam1 CCTV_Clients 5 Jan 6 11:04 ..
-rwxrwxr-x+ 1 GoatCam1 CCTV_Clients 983 Dec 29 16:49 .cshrc
-rwxrwxr-x+ 1 GoatCam1 CCTV_Clients 182 Dec 29 16:49 .login
-rwxrwxr-x+ 1 GoatCam1 CCTV_Clients 91 Dec 29 16:49 .login_conf
-rwxrwxr-x+ 1 GoatCam1 CCTV_Clients 301 Dec 29 16:49 .mail_aliases
-rwxrwxr-x+ 1 GoatCam1 CCTV_Clients 267 Dec 29 16:49 .mailrc
-rwxrwxr-x+ 1 GoatCam1 CCTV_Clients 728 Dec 29 16:49 .profile
-rwxrwxr-x+ 1 GoatCam1 CCTV_Clients 212 Dec 29 16:49 .rhosts
-rwxrwxr-x+ 1 GoatCam1 CCTV_Clients 780 Dec 29 16:49 .shrc

Obviously it's working correctly, because these files are older than 7 days.
1) Why is it finding just some of them, please? They all have the same creation date.
2) How do I stop it from finding these files, please, and just process the subfolders and their files?
 