• sugar_in_your_tea@sh.itjust.works · 3 months ago

    The only solution I can think of here is cryptographic signatures. That would prove:

    • the device/software in question stamped the video
    • the video has not been altered since it was stamped

    Individuals can also stamp their own videos so people can decide whether to trust them. Then players like YouTube, PeerTube, etc. could display the stamping information.
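
    For illustration, here’s a minimal sketch of what stamping and verification could look like, using a detached Ed25519 signature via Python’s cryptography package. The file name and the detached-signature scheme are assumptions for the example, not any real product’s format:

    ```python
    # Minimal sketch of video "stamping": a detached Ed25519 signature
    # over the file's bytes. Assumes the `cryptography` package; the
    # file name is hypothetical.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The recording device (or an individual) holds a private key.
    private_key = Ed25519PrivateKey.generate()
    public_key = private_key.public_key()

    # "Stamp" the video at recording time by signing its raw bytes.
    with open("clip.mp4", "rb") as f:
        video_bytes = f.read()
    signature = private_key.sign(video_bytes)

    # A player (YouTube, PeerTube, ...) verifies the stamp later.
    # Success proves the key holder signed these exact bytes and that
    # the file is unchanged since; any edit invalidates the signature.
    try:
        public_key.verify(signature, video_bytes)
        print("stamp valid: unaltered since signing")
    except InvalidSignature:
        print("stamp invalid: altered, or signed by a different key")
    ```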

    • Buttons@programming.dev · 3 months ago

      This is low-hanging fruit and should happen. All devices should cryptographically sign the video and audio they record. It’s not foolproof, since a state actor could probably extract the keys and forge a signature, but it would be better than nothing.

      Each device should have its own key. It’s quite difficult to hack a phone (possibly requiring disassembly), extract the private key from the hardware, reassemble the phone, and then forge a signature on a fake video. Yeah, it could happen, but if it’s a serious issue a court can inspect the phone the video allegedly came from, and normal people, at least, aren’t going to be able to forge a signed video.
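
      A rough sketch of the per-device idea, again assuming Python’s cryptography package. The Device class, serial number, and placeholder bytes are all hypothetical; a real design would keep the key in secure hardware rather than application memory:

      ```python
      # Sketch of per-device keys: each phone generates its own keypair
      # and the private key never leaves the device. Names are
      # hypothetical; assumes the `cryptography` package.
      import hashlib

      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

      class Device:
          def __init__(self, serial: str):
              self.serial = serial
              self._key = Ed25519PrivateKey.generate()  # unique per device

          def sign_recording(self, video_bytes: bytes) -> bytes:
              # Sign a SHA-256 digest so large files need not be signed whole.
              return self._key.sign(hashlib.sha256(video_bytes).digest())

          def public_key(self):
              return self._key.public_key()

      phone = Device("PHONE-1234")
      video = b"...raw recording bytes..."  # placeholder
      sig = phone.sign_recording(video)

      # A court inspecting the alleged source phone checks that the
      # video verifies against that specific phone's public key.
      try:
          phone.public_key().verify(sig, hashlib.sha256(video).digest())
          print("video matches this device's key")
      except InvalidSignature:
          print("video was not signed by this device")
      ```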

      If we get serious about this, devices could have security hardware that is difficult for even state-level actors to break.

      As others have said, though, people will still believe what they want, with or without fake videos, and even with or without evidence.

      • sugar_in_your_tea@sh.itjust.works · 3 months ago

        That’s true, but the more transparent and obvious the evidence against misinformation is, the more people will disregard that misinformation. You’ll always have a gullible minority that will believe whatever conspiracy, but democracy doesn’t hinge on them; it hinges on the quiet majority.

        That said, this should be done in a privacy-respecting way. You should be able to choose not to have your videos signed, not to associate the signature with a given account, and to supply your own signing key. Device resets should reset the key as well. There should also be a mechanism to associate multiple devices with the same source while still distinguishing between devices (so if one device is compromised, only that device’s videos become suspect).
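
        One way this multi-device association could work is a small key hierarchy: a long-lived source key certifies each per-device key, so videos trace to one source while devices stay distinguishable. A sketch, with all names hypothetical, assuming Python’s cryptography package:

        ```python
        # Sketch: a "source" key certifies per-device keys. A device
        # reset re-runs enrollment and rotates the device key; if one
        # device key leaks, only that device's videos become suspect.
        # Assumes the `cryptography` package; names are hypothetical.
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import (
            Ed25519PrivateKey,
            Ed25519PublicKey,
        )

        source_key = Ed25519PrivateKey.generate()  # one per source
        source_pub = source_key.public_key()

        def enroll_device():
            """Make a fresh device key plus a source-signed 'certificate'.
            A device reset simply re-runs this, rotating the key."""
            device_key = Ed25519PrivateKey.generate()
            device_pub_raw = device_key.public_key().public_bytes_raw()
            cert = source_key.sign(device_pub_raw)  # source vouches for device
            return device_key, device_pub_raw, cert

        phone_key, phone_pub_raw, phone_cert = enroll_device()
        camera_key, camera_pub_raw, camera_cert = enroll_device()

        video = b"...recording..."  # placeholder
        video_sig = phone_key.sign(video)

        # Verify the chain: source -> device key -> video.
        try:
            source_pub.verify(phone_cert, phone_pub_raw)
            Ed25519PublicKey.from_public_bytes(phone_pub_raw).verify(video_sig, video)
            print("video traces to this source via one specific device")
        except InvalidSignature:
            print("chain broken")
        ```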

        I think it could be done pretty unobtrusively, and it should start as a standard for journalism before going to the broader public.

    • Starkstruck@lemmy.world · 3 months ago

      I think one of the biggest issues is that even if you come up with a way to verify what is and isn’t AI-generated, it might not actually matter. We’ve already seen people believing the most obviously fake posts, because they’re just that gullible. Yes, we should come up with a system to verify things, but I fear the genie is already out of the bottle.

      • sugar_in_your_tea@sh.itjust.works · 3 months ago

        Well, you can’t fix stupid. Fortunately, there are enough not-stupid people that tooling can help fuel a misinformation-correction campaign. The more transparent the edits are, the easier it is to fact-check.