TikTok ran a deepfake ad of an AI MrBeast hawking iPhones for $2 — and it’s the ‘tip of the iceberg’::As AI spreads, it brings new challenges for influencers like MrBeast and platforms like TikTok aiming to police unauthorized advertising.

  • KairuByte@lemmy.dbzer0.com · 1 year ago

    So, the first reason is that the law likely already covers most cases where someone is using deepfakes. Using it to sell a product? Fraud. Using it to scam someone? Fraud. Using it to make the person say something they didn’t? Likely falls into libel.

    The second reason is that the current legislature doesn’t even understand how the internet works, is likely amazed that cell phones exist without the use of magic, and half of them likely have dementia. Good luck getting them to even properly understand the problem, never mind come up with a solution that isn’t terrible.

    • Pxtl@lemmy.ca · 1 year ago

      The problem is that realistically this kind of tort law is hilariously difficult to enforce.

      Like, 25 years ago we were pirating like mad, and it was illegal! But enforcement meant suing individual pirates one at a time, so in practice it was unenforceable.

      Then the DMCA was introduced, which defined how platforms were responsible for policing IP crime. Now every platform heavily automates copyright enforcement.

      Because there, it was big moneybags who were being harmed.

      But somebody trying to empty out everybody’s Gramma’s chequing account with fraud? Nope, no convenient platform enforcement system for that.

      • Ullallulloo@civilloquy.com · 1 year ago

        You’re saying the solution would be to hold TikTok liable in this case for failing to prevent fraud on its platform? In that case we wouldn’t even really need a new law. Repealing Section 230, or carving out exceptions to it, would make platforms responsible. That’s not a new idea, though. People have been pushing for it for years.

        • Pxtl@lemmy.ca · 1 year ago

          The DMCA wasn’t a blanket “you’re responsible now”. It defined a specific process: how you demand that something be taken down, and what the provider has to do in response.

        • Petter1@lemm.ee · 1 year ago

          Good luck with that, I guess. This company will be gone before misterB can finish writing his lawsuit, and with it all the scammed money. But I hope there’s at least some law forcing platforms not to promote scams, at least in some countries.