What version of `fd` are you using?

8.7.0

Hi, thanks for making such a great tool. One of the things I use it for is as a simple DIY file search-and-opener:
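The original snippet isn't reproduced here; a minimal sketch of such a search-and-opener (the exact flags, search root, and opener command are illustrative, not the real script) might look like:

```
#!/bin/sh
# Sketch: list files with fd, pick one interactively with fzf, open it.
# (Flags, the $HOME root, and xdg-open are illustrative choices.)
file=$(fd --type f . "$HOME" | fzf) && xdg-open "$file"
```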
The real script is more complicated than this and has more fanciness, but that's the basic core of what it does.
What I've noticed is that almost all of the files I search for are located in a handful of directories, perhaps a few dozen or so. These are things like my downloads folder, my folder full of saved PDFs, my folder for saved YouTube videos, etc. It's one of those Pareto-distribution things: 1% of the directories have 99% of the stuff I care about. But when I run my script, sometimes those directories take a while to appear in the fzf list (several seconds), because fd is traversing the filesystem in an arbitrary order, and the places it visits first are the ones unlikely to have the file I want.
What I would like to do is give fd a parameter (or several) telling it to visit those directories before doing the full recursive search of everywhere else. Something like:
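A hypothetical invocation (the `--priority` flag does not exist in fd today; this is the requested behavior):

```
# Hypothetical: traverse these directories first, then search the rest as usual.
fd --priority="$HOME/downloads" --priority="$HOME/pdfs" <pattern> "$HOME"
```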
Is this already possible to do? I could not see anything about it in the manual page. If it's not already possible to do, I would like to request it as a feature.
Pre-emptive notes:
This isn't a matter of fd searching too much; I already give it a pretty substantial exclude list of directories that it shouldn't go into. In principle it's already restricted to visiting only the places I might want to open something from. So the ignore list is not the problem here; the problem is the order of traversal.
It also isn't about the order of the printed text output, at least not as such. I know that I can already pipe fd's output through sort or some other program to re-order the list how I want. But that still requires me to wait until fd visits those important directories in the first place (and actually longer, because sort can't be done in streaming fashion). This is about performance/latency, not presentation order.
Also, I don't really know how subdirectories of priority directories should be handled. For example, suppose a directory layout like this:
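The original layout isn't shown here; an assumed one, just for illustration:

```
$HOME/
├── videos/
│   └── a.mp4
├── pics/
│   └── b.jpg
└── docs/
    └── c.txt
```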
and I pass `--priority=$HOME/videos --priority=$HOME/pics`, fd could conceivably list just the roots of the priority directories first, recursing into them later; or it might first recurse into subdirectories of the priority directories; or it might give the root of each priority directory first, then recurse into it.

I'm not sure which way is best. For my use-case I think it would be the first one, but I can see how the others could be useful too, in different circumstances.
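One plausible reading of the first and last orderings, using hypothetical paths `$HOME/videos/a.mp4`, `$HOME/pics/b.jpg`, and a non-priority `$HOME/docs/c.txt`:

```
# roots first, recurse later:      # root then recurse immediately:
$HOME/videos                       $HOME/videos
$HOME/pics                         $HOME/videos/a.mp4
$HOME/docs                         $HOME/pics
$HOME/docs/c.txt                   $HOME/pics/b.jpg
$HOME/videos/a.mp4                 $HOME/docs
$HOME/pics/b.jpg                   $HOME/docs/c.txt
```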
The way I am handling this at the moment is to run fd several times: first on each of the priority folders, then again on everywhere else, with the priority folders in the exclude list so they aren't visited twice. But this is really awkward to do, and probably not as performant as it could be.
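In sketch form (paths and exclude globs are illustrative; `-E`/`--exclude` is fd's exclude flag):

```
# Workaround: search priority dirs first, then everything else,
# excluding the priority dirs so nothing is visited twice.
fd <pattern> "$HOME/videos" "$HOME/pics"
fd <pattern> -E videos -E pics "$HOME"
```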
> The way I am handling this at the moment is to run fd several times: first on each of the priority folders, then again on everywhere else, with the priority folders in the exclude list so they aren't visited twice. But this is really awkward to do, and probably not as performant as it could be.
You can make that more performant (and more awkward) by running all the fds in parallel and merging their output:
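For instance (illustrative; each backgrounded fd writes to the shared stdout as it finds results, so the streams interleave):

```
# One fd per priority directory, plus one for the rest, all in parallel.
fd <pattern> "$HOME/videos" &
fd <pattern> "$HOME/pics" &
fd <pattern> -E videos -E pics "$HOME" &
wait
```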