• turmacar@lemmy.world · 3 days ago

    It would be interesting to have encrypted blobs scattered around volunteer computers/servers, like a storage version of BOINC / @HOME.

    People tend to have dramatically less spare storage space than spare compute time, though, and it would need to be very redundant to be guaranteed not to lose data.
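
    A back-of-the-envelope sketch of what “very redundant” means in practice, assuming a hypothetical k-of-n erasure-coded layout where each shard sits on an independent volunteer node (the parameters here are made up for illustration):

    ```python
    import math

    def survival_probability(n: int, k: int, node_availability: float) -> float:
        """Probability that at least k of n shards are reachable, assuming
        each volunteer node is independently online with probability
        node_availability."""
        return sum(
            math.comb(n, i)
            * node_availability**i
            * (1 - node_availability) ** (n - i)
            for i in range(k, n + 1)
        )

    # A 10-of-20 Reed-Solomon-style layout doubles the raw storage cost
    # but tolerates losing any 10 of its 20 shards.
    for avail in (0.5, 0.7, 0.9):
        p = survival_probability(n=20, k=10, node_availability=avail)
        print(f"node availability {avail:.0%}: data reachable {p:.4f}")
    ```

    At 50% node availability even that 2x overhead only keeps the data reachable about 59% of the time; the redundancy budget has to assume volunteer machines are frequently offline.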

    • douglasg14b@lemmy.world · 2 days ago

      Oh for sure, that’s quite reasonable, though at some point you’re just re-creating BitTorrent, which is actually the effect you want.

      You could build an appliance on top of the protocol that enables the distributed storage; that might actually be pretty reasonable 🤔

      Of course you’d need your own protocols to break the data up into manageable parts, chunked in a sane way, and to make content removable from the network, or at least possible to make inaccessible, for DMCA claims. That kind of takedown capability is what currently keeps the Internet Archive from being too much of a target for government entities.
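
      A minimal sketch of that chunk-and-takedown layer, assuming content-addressed chunks and a shared blocklist that nodes honor (all names here are hypothetical):

      ```python
      import hashlib

      CHUNK_SIZE = 16 * 1024 * 1024  # 16 MiB per chunk; the size is an arbitrary choice

      def chunk_and_hash(data: bytes) -> list[tuple[str, bytes]]:
          """Split a blob into fixed-size chunks keyed by SHA-256 digest,
          so every node addresses the same content the same way."""
          return [
              (hashlib.sha256(data[i : i + CHUNK_SIZE]).hexdigest(),
               data[i : i + CHUNK_SIZE])
              for i in range(0, len(data), CHUNK_SIZE)
          ]

      # Takedowns become a shared blocklist of chunk hashes: nodes may keep
      # or delete the bytes, but either way they stop serving listed hashes.
      blocklist: set[str] = set()

      def serve_chunk(store: dict[str, bytes], chunk_id: str) -> bytes | None:
          if chunk_id in blocklist:
              return None  # honor the DMCA-style takedown
          return store.get(chunk_id)
      ```

      Content addressing means a takedown only needs to circulate hashes, never the infringing bytes themselves.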

      • turmacar@lemmy.world · 2 days ago

        Yeah, some kind of fork of the torrent protocol where you can advertise “I have X amount of space to donate” and there’s a mechanism that hands you the most endangered bytes on the network, maybe. It would need to be a lot more granular than torrents, to account for the vast majority of nodes not wanting, or not being able, to get to “100%”.
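
        A minimal sketch of that “most endangered bytes first” allocation, assuming the network can report a replica count per chunk (names and numbers are illustrative):

        ```python
        def assign_chunks(replica_counts: dict[str, int],
                          chunk_size_bytes: int,
                          donated_bytes: int) -> list[str]:
            """Fill a volunteer's donated space with the most endangered
            chunks first, i.e. the ones with the fewest known replicas."""
            budget = donated_bytes // chunk_size_bytes
            by_rarity = sorted(replica_counts, key=replica_counts.get)
            return by_rarity[:budget]

        # A node donating 64 MiB of 16 MiB chunks takes the 4 rarest chunks.
        counts = {"chunk-a": 1, "chunk-b": 12, "chunk-c": 3,
                  "chunk-d": 7, "chunk-e": 2}
        print(assign_chunks(counts, 16 * 2**20, 64 * 2**20))
        # ['chunk-a', 'chunk-e', 'chunk-c', 'chunk-d']
        ```

        Fixed-size chunks are what make the granularity problem go away: a node never commits to a whole collection, just to however many chunks fit its donation.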

        I don’t think the technical aspects are insurmountable, and there’s at least some measure of a built-in audience, in that a lot of people run ArchiveTeam Warrior containers/VMs. But storage is just so many orders of magnitude more expensive than letting a little CPU/bandwidth-limited process run in the background. I don’t know that enough people would be willing/able to donate enough to make it viable?

        ~70,000 data hoarders volunteering 1 TB each to be a 1:1 backup of the current archive.org isn’t a small number of people, and that only gets you a single parity copy. But it also isn’t an outrageously large number of people.
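
        The arithmetic, taking the ~70 PB total implied by those numbers at face value:

        ```python
        ARCHIVE_PB = 70       # implied by the 70,000 x 1 TB figure above
        TB_PER_VOLUNTEER = 1

        one_copy = ARCHIVE_PB * 1000 // TB_PER_VOLUNTEER
        print(one_copy)       # 70000 volunteers for a single 1:1 copy

        # Real durability needs redundancy; a 2x erasure-coded layout
        # (e.g. 10-of-20 shards) doubles the headcount.
        print(one_copy * 2)   # 140000 volunteers
        ```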

        • douglasg14b@lemmy.world · 2 days ago

          You might not have to fork BitTorrent at all. Instead, if you have your own protocol for grouping and breaking the data into manageable chunks of a particular size, each one of those chunks can be an actual, full torrent. Then you don’t have to worry about completion levels on those torrents and can rely on the protocol to do its thing.

          Instead of trying to modify the protocol, modify the process you use the protocol with.
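
          A minimal sketch of that layering, with a hypothetical manifest builder; the digest is a stand-in, not a real BitTorrent infohash computation:

          ```python
          import hashlib
          import json

          CHUNK_BYTES = 4 * 2**30  # 4 GiB per chunk-torrent; the size is arbitrary

          def build_manifest(files: list[tuple[str, int]],
                             chunk_bytes: int = CHUNK_BYTES) -> list[dict]:
              """Greedily group (path, size) pairs into fixed-size chunks;
              each chunk would then be published as one ordinary torrent."""
              groups, current, used = [], [], 0
              for path, size in files:
                  if current and used + size > chunk_bytes:
                      groups.append(current)
                      current, used = [], 0
                  current.append(path)
                  used += size
              if current:
                  groups.append(current)
              return [
                  {
                      "chunk": i,
                      "files": group,
                      # Stand-in id; a real implementation would put the
                      # torrent's actual infohash here.
                      "id": hashlib.sha1(json.dumps(group).encode()).hexdigest(),
                  }
                  for i, group in enumerate(groups)
              ]
          ```

          Each manifest entry is an ordinary, self-contained torrent, so existing clients, trackers, and the DHT all work unmodified.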