Hello, I have a file that I would like to store for a long time (offline).
I have put it on magnetic tape in a sealed enclosure, but time will destroy everything eventually, so I was wondering if there is a way to package the file (something like zipping it) so that it can be restored if/when corruption happens? Much like RAID, but for just one file.

  • jasont80@alien.topB · 10 months ago

    M-Disc with another M-Disc copy stored off-site. Basically, the only way to get 100% instead of 99.9999% for the next 100 years.

  • dr100@alien.topB · 10 months ago

    RAID is for uptime or speed, not for this. It also comes with the dubious feature that you can lose your data once more, without any disk failures.

    This question has been asked many times. Have multiple copies, replace bad copies with copies. That’s it.
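
    For what that routine looks like in practice, here is a minimal sketch in plain Python using only the standard library. The file paths and the stored reference hash are made up for illustration; run something like this on whatever schedule you trust.

        import hashlib
        import shutil
        from pathlib import Path

        # hypothetical locations of the copies, plus the hash recorded when the file was created
        COPIES = [Path("/mnt/disk1/archive.bin"),
                  Path("/mnt/disk2/archive.bin"),
                  Path("/mnt/usb/archive.bin")]
        KNOWN_GOOD_SHA256 = "put-the-original-hash-here"

        def sha256(path: Path) -> str:
            # stream the file in 1 MiB blocks so large archives don't need to fit in RAM
            h = hashlib.sha256()
            with path.open("rb") as f:
                for block in iter(lambda: f.read(1 << 20), b""):
                    h.update(block)
            return h.hexdigest()

        good = [p for p in COPIES if p.exists() and sha256(p) == KNOWN_GOOD_SHA256]
        bad = [p for p in COPIES if p not in good]

        if not good:
            raise SystemExit("every copy failed verification -- restore from the offsite copy")
        for p in bad:
            shutil.copy2(good[0], p)  # overwrite the damaged or missing copy from a verified one
            print(f"replaced {p} from {good[0]}")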

  • Deckdestroyerz@alien.topB · 10 months ago

    Print out the binary code and scan it using OCR later on. Store that in a fireproof safe (/s but would work)

  • dcabines@alien.topB · 10 months ago

    While the par file suggestions are the most reasonable thing to do, there are some more interesting alternatives.

    Try horcrux. It’ll split your file into several pieces and make it so you only need a few of the pieces to recreate your file. You could make 99 pieces and only require 3 of them to reassemble. That way, even if most of them are damaged somehow, you can still recreate your file.

    This is similar to how Storj splits your files into 80 pieces and only needs 29 to recreate your data. It is also similar to how satellites transmit data when part of the message can be lost in transmission.
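
    To make the “any k of n pieces” idea concrete, here is a toy sketch in plain Python. This is not how horcrux or Storj actually implement it (horcrux adds encryption and both use production-grade erasure coding over large blocks); the function names and the tiny chunk size are made up purely to illustrate the principle: each group of k data bytes defines a polynomial, you store n evaluations of it, and any k of those evaluations are enough to rebuild the group.

        P = 257  # smallest prime above 255, so byte values fit in the field

        def lagrange_eval(points, x):
            # evaluate the unique polynomial through `points` [(xi, yi), ...] at x, mod P
            total = 0
            for i, (xi, yi) in enumerate(points):
                num, den = 1, 1
                for j, (xj, _) in enumerate(points):
                    if i != j:
                        num = num * (x - xj) % P
                        den = den * (xi - xj) % P
                total = (total + yi * num * pow(den, -1, P)) % P
            return total

        def make_shares(chunk, n):
            # chunk: k data bytes; shares 1..k are the bytes themselves,
            # shares k+1..n are extra points on the same degree-(k-1) polynomial
            base = list(zip(range(1, len(chunk) + 1), chunk))
            return [(x, lagrange_eval(base, x)) for x in range(1, n + 1)]

        def recover(shares, k):
            # any k surviving (x, value) pairs pin down the polynomial,
            # so re-evaluating it at x = 1..k gives back the original bytes
            pts = shares[:k]
            return bytes(lagrange_eval(pts, x) for x in range(1, k + 1))

        data = list(b"imp")                            # one k-byte chunk of the file
        k, n = 3, 9                                    # any 3 of 9 shares rebuild the chunk
        shares = make_shares(data, n)
        survivors = [shares[1], shares[4], shares[8]]  # pretend the other six were destroyed
        assert recover(survivors, k) == bytes(data)

    One caveat: the extra shares can take the value 256, which doesn’t fit in a raw byte, so a real tool stores shares in its own container format; that bookkeeping is part of why dedicated tools exist.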

  • lonewolf7002@alien.topB · 10 months ago

    If it’s really, really important, I would probably RAR it with a larger percentage of recovery records, then use PAR files against it, then store it in many, many places. And every once in a while, copy it to new places, checking the PAR files for checksum errors.
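
    The RAR recovery record and PAR files both boil down to Reed-Solomon parity: extra bytes stored alongside the data that let you repair a limited amount of corruption in place. As a rough illustration of that idea (not a replacement for rar/par2, which have their own file formats), here is a sketch using the third-party reedsolo Python package; the return value of decode() has changed between versions, hence the defensive handling.

        from reedsolo import RSCodec

        rsc = RSCodec(32)                  # 32 parity bytes: corrects up to 16 corrupted bytes
        original = b"the one file you actually care about"
        protected = rsc.encode(original)   # original data with 32 parity bytes appended

        # simulate bit rot by flipping a few bytes
        damaged = bytearray(protected)
        damaged[0] ^= 0xFF
        damaged[7] ^= 0xFF

        result = rsc.decode(bytes(damaged))
        # newer reedsolo versions return (message, message+ecc, errata positions)
        recovered = result[0] if isinstance(result, tuple) else result
        assert bytes(recovered) == original

    Tools like par2 apply the same kind of code over large blocks of the file rather than individual bytes, which is what makes them practical for multi-gigabyte archives.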

  • uluqat@alien.topB · 10 months ago

    Many copies in many places, with some of the places being as far away as possible - a different country at least, a different continent even better.

  • ThickSourGod@alien.topB · 10 months ago

    Par files are a good start, but if you want a truly resilient archive, you need to actively manage it. You need multiple copies in multiple places that you check on a regular schedule.

  • smstnitc@alien.topB · 10 months ago

    If it’s that important, I’d have multiple copies on multiple media, in multiple geographic locations. Like two hard drives and a flash drive, as well as the tape. And in multiple formats. Like raw, and rar’d.

    My point is, anything important, you shouldn’t have only one copy of it. The more important it is, the more copies in separate locations you should have.

    • NiteShdw@alien.topB · 10 months ago

      I have some disk images that need to be bit-perfect. I store them on a ZFS RAID AND I made PAR files for additional redundancy, and I keep two copies on two different computers.