How do people deal with long pathnames?


toolforger (Explorer) · Joined: Aug 8, 2017 · Messages: 60
My use case is a home office with Linux and Windows clients that do their backups to FreeNAS.

Situation 1: A Windows client has a directory structure where pathnames come close to the 260-character limit. The Windows client tries to copy stuff to FreeNAS. FreeNAS means UNC paths, so C:\xxx becomes \\freenas.local\username\xxx, which is ~20 characters longer. So even if the backup works, Windows Explorer will fail when the user tries to copy that file back (i.e. when it needs to be restored).
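For what it's worth, a rough way to spot these in advance is to measure the paths on the NAS side and add the length of the UNC prefix. Just an untested sketch; the dataset path and the prefix length are assumptions you'd adjust to your share layout:

    # Run on the FreeNAS box (or any machine with the dataset mounted).
    # Assumes the share root is /mnt/tank/backups/username, reached from Windows
    # as \\freenas.local\username\ (about 25 characters of prefix).
    PREFIX_LEN=25
    find /mnt/tank/backups/username -type f | while IFS= read -r p; do
        rel=${p#/mnt/tank/backups/username/}    # path relative to the share root
        len=$(( ${#rel} + PREFIX_LEN ))         # slashes vs backslashes: same length
        [ "$len" -ge 260 ] && printf '%4d  %s\n' "$len" "$rel"
    done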

Situation 2: Linux limits pathnames to 4096 characters (in theory; in practice it depends on the filesystem, so the actual limit can be even higher or much lower).
Now if I happen to have a really deeply nested directory structure, that won't back up to FreeNAS.
In this case, the path of least resistance is probably telling Linux that it shouldn't allow longer pathnames. Or warn me about them when they happen - however, I'd like the warning within a minute or so, because if I do an hourly backup I don't want that file to be missed just because I forgot to shorten a directory name. I could do a cron job, but I'd like the output in the systray, not in a mail to the root user that may sit in my inbox for hours.
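Roughly what I have in mind, as an untested sketch (assumes notify-send from libnotify; whether the notification actually reaches the desktop from cron depends on the session environment):

    #!/bin/sh
    # Hourly cron job on the Linux client. WATCH_DIR and LIMIT are placeholders.
    # From cron, notify-send usually needs DBUS_SESSION_BUS_ADDRESS exported.
    WATCH_DIR="$HOME"
    LIMIT=1000    # some headroom below the limit on the NAS side
    LONG=$(find "$WATCH_DIR" -xdev 2>/dev/null | awk -v max="$LIMIT" 'length > max')
    [ -n "$LONG" ] && notify-send "Backup path warning" \
        "$(printf 'Paths longer than %s characters:\n%s' "$LIMIT" "$LONG")"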
Well, obviously my understanding of the Linux case is somewhat incoherent, so any good concepts are very much welcome.

What do you people do?
 

Chris Moore (Hall of Famer) · Joined: May 2, 2015 · Messages: 10,080
I keep my path names short. It saves problems all the way around. Also, I skip the whole username (UNC) thing in my home network because everyone that can access the network has access to all files. Backups are in directories by the name of the computer that is being backed up.
 

styno (Patron) · Joined: Apr 11, 2016 · Messages: 466
Why not use some (backup) software for this? Or zip/tar it to a file. Or use the built-in backup solutions provided by the OS?
 

toolforger (Explorer) · Joined: Aug 8, 2017 · Messages: 60
I keep my path names short. It saves problems all the way around.

I cannot control all path names.
There's people who name their music files with artist, title, and album information.
There's unpacked Eclipse installs with names like ~/projects/delta/eclipse-jee-luna-SR2-linux-gtk-x86_64/features/org.eclipse.jst.jsf.apache.trinidad.tagsupport.feature_2.5.2.v201410101742/META-INF/maven/org.eclipse.webtools.jsf/org.eclipse.jst.jsf.apache.trinidad.tagsupport.feature/pom.properties.

Fortunately, these are all a far cry from the documented 1024-byte file path limit.
I am more after getting informed about potential problems *before* they make it difficult to access files. I.e. the Windows users here need to know when their pathnames approach the 260-character limit, and the Linux users need to know when their pathnames approach the 1024-character limit. In both cases, they need to know that with some buffer space so they can adapt their naming strategies in time and don't hit a hard limit with no good fall-back strategy.
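A simple headroom report would probably be enough for that. Something like this sketch, run on the FreeNAS box (the dataset path is an assumption), prints the longest path per user directory so everyone can see how far they are from the limit:

    for dir in /mnt/tank/backups/*/; do
        # longest pathname length found under each user's directory
        max=$(find "$dir" | awk '{ if (length > m) m = length } END { print m + 0 }')
        printf '%5d  %s\n' "$max" "$dir"
    done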

Also, I skip the whole username (UNC) thing in my home network because everyone that can access the network has access to all files.

Well... people around here are extremely relaxed about sharing their personal workspace - but that's just what they say, and maybe even what they think, and neither is necessarily how it turns out in practice.
I'm not sure that I want to take that kind of social risk.

Backups are in directories by the name of the computer that is being backed up.

I'll probably do a per-user naming scheme with very short user names.
I will have to allow people to organize the subdirectory structures by themselves, just to keep acceptance high. There's a fair chance that nobody will use the .../usr/machine-name/original-file-path scheme, but I cannot be sure, so I don't really want to rely on that.
 

toolforger (Explorer) · Joined: Aug 8, 2017 · Messages: 60
Why not use some (backup) software for this? Or zip/tar it to a file. Or use the built-in backup solutions provided by the OS?

The files need to remain accessible via SMB or NFS shares.
For a multitude of reasons, which essentially boil down to things being much easier if you can directly use your well-known filesystem tools to hunt down the data that you need.
E.g. you deleted a file three months ago and have only a vague memory of its name, and now you need to find the last backup that still contains it. With an rsync-style backup that's easy: a simple find command on Linux and an Explorer search on Windows (see the sketch below). With backup software you usually don't get that kind of functionality, and with zip/tar you need to look up unfamiliar commands.
Things get even more difficult if you have to do a content search. Or want to find the backup where a specific change was made to a file.
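For example, with snapshots exposed under the dataset's .zfs/snapshot directory, hunting for a half-remembered name or a phrase across all backups is a one-liner (the search patterns are made up, and the dataset path is an assumption):

    # search file names across all snapshots
    find /mnt/tank/backups/.zfs/snapshot -iname '*invoice*' 2>/dev/null
    # content search across all snapshots
    grep -rl 'the phrase I remember' /mnt/tank/backups/.zfs/snapshot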

My current thinking is that a backup operation will be a ZFS snapshot plus an rsync run.
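Roughly, per run (a sketch only - host, dataset, and user names are placeholders; snapshotting first preserves the previous backup state before rsync overwrites it, though snapshotting after the run would work just as well):

    #!/bin/sh
    # One backup run, driven from the Linux client.
    ssh root@freenas.local zfs snapshot "tank/backups/username@$(date +%Y-%m-%d_%H%M)"
    rsync -a --delete /home/username/ root@freenas.local:/mnt/tank/backups/username/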
 