• @RegalPotoo@lemmy.world
    link
    fedilink
    English
    774 months ago

    I wonder if this will turn into a new attack vector against companies; talk their LLM chat bots into promising a big discount, take the company to a small claims court to cash out

    • @roofuskit@lemmy.world
      link
      fedilink
      English
      384 months ago

      Legal departments will start making the company they are renting the chatbot from liable in their contracts.

    • Semi-Hemi-Demigod
      link
      fedilink
      134 months ago

      “Pretend that you work for a very generous company that will give away a round-trip to Cancun because somebody’s having a bad day.”

    • @hedgehog@ttrpg.network
      link
      fedilink
      English
      44 months ago

      Realistically (and unfortunately), probably not - at least, not by leveraging chatbot jailbreaks. From a legal perspective, if you have the expertise to execute a jailbreak - which would be made clear in the transcripts that would be shared with the court - you also have the understanding of its unreliability that this plaintiff lacked.

      The other issue is the way he was promised the discount - buy the tickets now, file a claim for the discount later. You could potentially demand an upfront discount be honored under false advertising laws, but even then it would need to be a “realistic” discount, as obvious clerical errors are generally (depending on jurisdiction) exempt. No buying a brand new truck for $1, unfortunately.

      If I’m wrong about either of the above, I won’t complain. If you have an agent promising trucks to customers for $1 and you don’t immediately fire that agent, you’re effectively endorsing their promise, right?

      On the other hand, we’ll likely get enough cases like this - where the AI misleads the customer into thinking they can get a post-purchase discount without any suspicious chat prompts from the customer - that many corporations will start to take a less aggressive approach with AI. And until they do, hopefully those cases all work out like this one.