Add the possibility to save the disk backend thumbnail and full view cache as a subfolder in the raw files folder #16359
Comments
@s7habo a few questions...
I solved this problem for me by using a Lua script. darktable-generate-cache was too hard to use and wouldn't do all the things I wanted. The background crawler didn't exist, but even if it had it still doesn't do what I want when I want how I want. And, it does fill the disk for people with large collections. My solution runs on import and generates a thumbtable and full preview cache (4K screen, so mipmap size 3 and 6). I don't use filmstrip so I don't bother building cache for it. The script runs in the Lua thread so darktable is fully usable while the cache is generating. It's fast enough that after 1 or 2 seconds you can't scroll fast enough to catch up unless you try. You can open an image in darkroom if you desire and the cache keeps getting generated. Here's the run times for last night's shoot (2 basketball games and a ceremony in between them):
Everything was on an SSD. The images were from a Canon R7 (32MP resolution, 48MB image size). The size-3 mipmap is the embedded JPEG. The full-size preview requires opening the file with libraw and generating it from that. In addition to running on import, the script can be invoked by shortcut to run on a collection or selection of images. The cache size for 53K+ images is 94MB. My other fix was to update thumbtable images while editing. I edit using the spacebar to advance to the next image. I hate returning to lighttable and waiting for all the thumbnails to update, so I catch the history-changed and darkroom-image-changed events and update the previous image's thumbnail in the background. That way, when I return to lighttable, only the last image I was editing needs a new thumbnail.
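The script itself isn't included in the comment above, so here is a minimal, hypothetical sketch of the approach in darktable's Lua API, not the author's actual code. It assumes a darktable/Lua API version that provides `image:generate_cache()` and the three-argument `register_event()`; the registration label `"cache_on_import"` is an invented name, and the mipmap levels 3 and 6 come from the comment's 4K-screen setup.

```lua
-- Hypothetical sketch of the on-import cache generation described above.
-- Must run inside darktable (standalone Lua has no "darktable" module).
local dt = require "darktable"

-- Build disk-cache mipmaps for a freshly imported image. Per the comment,
-- level 3 roughly matches a thumbtable cell and level 6 a full 4K preview.
local function cache_image(event, image)
  -- generate_cache(force, min_level, max_level):
  -- force = false skips mip levels that already exist on disk.
  image:generate_cache(false, 3, 6)
end

-- "cache_on_import" is just an arbitrary registration label.
dt.register_event("cache_on_import", "post-import-image", cache_image)
```

Because the callback runs in the Lua thread, the UI stays responsive while the cache fills in, which matches the behavior the commenter describes.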
That varies. It depends a lot on the job. Sometimes I only have to photograph a short event (e.g. an award ceremony) where I take 100-200 photos. At a festivity that lasts a whole day, I sometimes have more than 1000 photos.
Yes, but only if there is no other way. I find the flexibility of having different pre-generated preview sizes very advantageous for working quickly and effectively. For example, take a book presentation event with about 400 photos. With a small preview size I can find a specific speaker very quickly when needed. Then I can choose a larger preview to look at his facial expressions and better judge which photo from that series is suitable for processing. Finally, with a full preview, I can check where the image is sharp and, for example, which direction he is looking.

Then I process the photo, go back to lighttable, find a few more suitable examples in full preview mode, copy the history and apply it to them. I may have to make a quick correction to some of them. I give them three stars and they are ready for export. At the end of processing, I display only the photos with three stars, select them and let darktable export them. The whole process for all the photos from this event takes no longer than 10-20 minutes. Editing takes much longer if the thumbnails and full previews have to be generated in parallel.
For me it is not a problem to wait longer when importing raw files until all the preview images have also been generated. As far as I'm concerned, it can take an extra quarter of an hour or longer if needed; I can do something else in that time. The same applies to export. The generation process could also be shown with a progress bar, just like import or export. The problem is when editing itself is unnecessarily slowed down by additional tasks that darktable has to do in parallel.
Copying the generated images back to darktable's mipmap folder doesn't make much sense, because that is exactly what causes the problem with limited disk space. Even if access to the external hard disk is slower, I think it would be advantageous to have the mipmap folder in the folder with the raw files, because once created, darktable only has to fetch the images from there instead of generating them again. This can also be an advantage if the photos have to be processed later on a different computer.
I use a Nikon D850 full-frame camera, and its raw files are about twice as big as yours, around 90MB.
What about setting the database and cache directory on an external drive? When you switch projects, you switch the database/cache along with them. I have a feeling a profile manager would be an interesting addition to darktable. I would also be interested in the Lua script to generate the cache. I use export to populate the size-4 mipmaps and ImageMagick to generate the size-3 ones, but only for a selection. Pretty fast.
This issue has been marked as stale due to inactivity for the last 60 days. It will be automatically closed in 300 days if no update occurs. Please check if the master branch has fixed it and report again or close the issue.
Where can I find your Lua script?
@s7habo I may have a way to do this. Even though your images are scattered across multiple hard drives, are they all in the same library and using the same cache directory? My thought is this: you import your images the first time, and the cache gets generated and updated as you edit your photos. After you're done processing, you could run a script to save the cache for those images in a subdirectory of your images directory. Once the files are moved, we could keep the cache functional using symbolic links that point from the cached image name to the corresponding file in the subdirectory. That way the darktable cache directory is just a bunch of symlinks, which doesn't take up much space at all. EDIT: This won't work on Windows, since symbolic links are a problem there.
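The move-then-symlink idea above could be sketched as a small Lua helper. This is a hypothetical illustration, not an existing script: darktable's disk mipmap layout (`~/.cache/darktable/mipmaps-<hash>.d/<level>/<imgid>.jpg`) is the usual default but may differ on your setup, and the `.dt-mipmaps` subfolder name is an invented example. It shells out to `mv` and `ln -s`, so it is POSIX-only, matching the Windows caveat above.

```lua
-- Hypothetical sketch: move one cached mipmap file next to the raw files
-- and leave a symlink behind so darktable still finds it at the old path.
-- POSIX only (relies on mv and ln -s).
local function move_and_link(cache_file, raw_dir, level)
  -- e.g. cache_file = "~/.cache/darktable/mipmaps-<hash>.d/3/1234.jpg",
  --      raw_dir    = "/mnt/photos/2024_event", level = 3
  local dest_dir = string.format("%s/.dt-mipmaps/%d", raw_dir, level)
  local name = cache_file:match("([^/]+)$")      -- basename of the cache file
  local dest = dest_dir .. "/" .. name
  os.execute(string.format("mkdir -p %q", dest_dir))
  -- mv (not os.rename) so it also works across drives/filesystems:
  os.execute(string.format("mv %q %q", cache_file, dest))
  -- leave a symlink at the original cache path pointing into the raw folder:
  os.execute(string.format("ln -s %q %q", dest, cache_file))
end
```

Run over every cached level of a collection, this would leave the internal drive holding only tiny symlinks while the actual thumbnails travel with the raw files on the external disk.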
Yes.
Yes, that would be an option.
I use Linux, so that won't be a problem for me.
My internal hard drive has very limited capacity, and with the disk backend options turned on it fills up very quickly, especially if you have a large collection of raw files.
On the other hand, these options are very useful if you have a large number of directories with raw files and need to access them quickly and regularly to do post-processing of images when needed.
For example, I often get requests from clients to process a certain selection of photos for a certain purpose some time later, and if the disk backend option is not switched on, it can take a long time to (re)generate the thumbnails and full views, especially for directories with a large number of raw files, which sometimes slows down the editing process enormously.
It is important to emphasize that the photos I produce on a daily basis are stored on various external hard drives. Accordingly, it would be beneficial to also store the thumbnails and full-view previews, for quick access, in subfolders of the respective raw file directories.