Dedupe, even with the recent improvements, has huge overheads and will generally degrade in performance as the dataset grows, because it has to keep the deduplication table (the 'routing' table that redirects requests for deduplicated blocks to the actual stored data) in RAM. Apparently the latest OpenZFS release reduces the speed loss on larger datasets, but it's still subpar compared to just compressing the data.
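If you're curious how big that table actually gets, you can inspect it on an existing pool (the pool name 'tank' below is just a placeholder):

    zpool status -D tank
    # the "dedup: DDT entries ..." line reports how many entries the
    # dedup table holds and roughly how much space they take on disk
    # and in core (i.e. the RAM that has to stay resident for lookups)

The in-core number is the overhead everyone complains about.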
Video files are already heavily compressed, so you'd be better off transcoding them to a more efficient codec, like H.265 (via x265) or AV1, to save space on video files.
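For example, a typical re-encode with ffmpeg looks like this (file names and CRF/preset values are just illustrative, tune them to taste):

    # HEVC via x265, copy the audio untouched
    ffmpeg -i input.mkv -c:v libx265 -crf 26 -preset slow -c:a copy output_hevc.mkv

    # or AV1 via SVT-AV1
    ffmpeg -i input.mkv -c:v libsvtav1 -crf 32 -preset 6 -c:a copy output_av1.mkv

Keep in mind it's lossy-on-lossy, so hang on to the originals if quality matters.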
You'd be better off enabling compression on the dataset.
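Something like this on an existing dataset (dataset name is a placeholder; lz4 is the usual near-free choice, zstd trades a bit of CPU for better ratios on newer OpenZFS):

    zfs set compression=lz4 tank/data
    # or: zfs set compression=zstd tank/data

    # check the achieved ratio later
    zfs get compressratio tank/data

Note it only applies to data written after the property is set; existing blocks stay uncompressed until they're rewritten.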