• remi_pan@sh.itjust.works

    If the jailbreak is about enabling the LLM to tell you how to make explosives or drugs, this seems pointless, because I would never trust an AI so prone to hallucinations (and basically bad at science) with such a dangerous process.