Checksum in archival workflow?
I’m trying to help a colleague with a fairly large archiving/digitizing project covering a career’s worth of work. As producers go, he is pretty technical, but not a hands-on techie per se. We can use ShotPut Pro to copy all files and provide checksum verification (plus a checksum value for future reference) every time an asset is copied. I don’t think his scale and needs warrant LTO tape.
Most likely, the digitized NTSC “master” files will get stored on a minimum of two drives, each in a different location. Good-quality access copies (aka screeners) will get stored in the cloud. Once we accomplish this, it’s hard to know what my parting words (or memo) about re-verification and auditing of his archives should be.
I know there are rules of thumb that files should get migrated to new drives every so many years. But trickier still for me is suggesting a re-verification regime for a non-techie. Suggestions?
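To frame what I mean by a re-verification regime, here is a minimal sketch of what a once- or twice-a-year verification pass could look like, assuming a simple md5sum-style manifest (one "checksum  relative/path" line per file) was written at archive time. The drive mount point and manifest name below are placeholders for illustration only, not anything from my colleague's actual setup:

#!/usr/bin/env python3
# Sketch of a periodic re-verification pass over one archive drive.
# Assumes a manifest of MD5 values was written when the archive was created
# (one "md5  relative/path" line per file). Paths below are hypothetical.
import hashlib
import sys
from pathlib import Path

ARCHIVE_ROOT = Path("/Volumes/ARCHIVE_A")   # hypothetical drive mount point
MANIFEST = ARCHIVE_ROOT / "manifest.md5"    # hypothetical manifest file


def md5_of(path: Path, chunk_size: int = 1 << 20) -> str:
    # Stream the file in 1 MB chunks so large video masters don't fill RAM.
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def main() -> int:
    failures = []
    for line in MANIFEST.read_text().splitlines():
        if not line.strip():
            continue
        expected, rel_path = line.split(None, 1)
        target = ARCHIVE_ROOT / rel_path.strip()
        if not target.exists():
            failures.append((rel_path, "MISSING"))
        elif md5_of(target) != expected:
            failures.append((rel_path, "CHECKSUM MISMATCH"))

    if failures:
        print("Files needing replacement from the other copy:")
        for rel_path, reason in failures:
            print(f"  {reason}: {rel_path}")
        return 1
    print("All files verified OK.")
    return 0


if __name__ == "__main__":
    sys.exit(main())

The idea is that something this simple could be run against each archive drive periodically, with any flagged files replaced from the other copy. Whether a dedicated tool does this better is exactly what I'm asking about.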
For years I have used CDFinder (now NeoFinder) to ride herd on file collections spanning many hard drives. I don’t have the latest NeoFinder, but it offers a FileCheck feature that seems to address this issue (see below). This seems like it might be a great fit, but I would love to hear from others.
Thanks in advance for any suggestions.
Paul
https://www.cdfinder.de/en/en/filecheck.html
If you verify the FileCheck values for an entire catalog, CDFinder will even show you a window containing all files who did NOT pass the check, so you know exactly which files are damaged and need to be replaced. Of course, CDFinder also displays the actual MD5 value for every file in the Inspector: