New acoustic attack steals data from keystrokes with 95% accuracy

A team of researchers from British universities has trained a deep learning model that can steal data from keyboard keystrokes recorded using a microphone with an accuracy of 95%.
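As a rough illustration of the technique, here is a minimal sketch of the kind of pipeline such an attack typically uses: isolate a clip containing a single keystroke, convert it to a mel-spectrogram, and classify it with a neural network. The sample rate, number of key classes, file name, and the small CNN below are illustrative assumptions, not the researchers' actual setup.

```python
# Minimal sketch of an acoustic keystroke classifier: isolated keystroke clips
# are turned into log-mel-spectrograms and classified by a small CNN.
# Clip format, class count, and network size are illustrative assumptions.
import torch
import torch.nn as nn
import torchaudio

SAMPLE_RATE = 44_100
NUM_KEYS = 36            # e.g. a-z plus 0-9; an assumption for illustration

# Turn a 1-D waveform of a single keystroke into a log-mel-spectrogram "image".
mel = torchaudio.transforms.MelSpectrogram(
    sample_rate=SAMPLE_RATE, n_fft=1024, hop_length=256, n_mels=64
)
to_db = torchaudio.transforms.AmplitudeToDB()

def featurize(waveform: torch.Tensor) -> torch.Tensor:
    # waveform: (1, num_samples) clip containing one key press
    return to_db(mel(waveform))          # -> (1, 64, time_frames)

class KeystrokeCNN(nn.Module):
    """Small CNN over mel-spectrograms; stands in for a deeper production model."""
    def __init__(self, num_classes: int = NUM_KEYS):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d((4, 4)),
        )
        self.classifier = nn.Linear(32 * 4 * 4, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Usage (hypothetical file name): classify one recorded keystroke clip.
# waveform, sr = torchaudio.load("keystroke.wav")
# logits = KeystrokeCNN()(featurize(waveform).unsqueeze(0))
# predicted_key = logits.argmax(dim=1)
```

Training such a classifier still needs labelled recordings from the specific keyboard (and, as the comments below discuss, the specific room), which is the constraint the thread is debating.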

  • Obsession@lemmy.world · ↑50 · 1 year ago

    That’s pretty much what the article says. The model needs to be trained on the target keyboard first, so you won’t just have people hacking you through a random Zoom call.

    • bdonvr@thelemmy.club · ↑21 ↓2 · 1 year ago

      And if you have the access needed to train such a model, slipping a keylogger onto the machine would be so much easier.

      • jumperalex@lemmy.world · ↑2 ↓3 · 1 year ago

        Hmm, not totally. A bad actor could record a keyboard to train the model and then figure out a way to get that same keyboard installed, either through a logistics attack (not everyone maintains a secure supply chain) or an insider threat installing it. Everyone’s trained not to allow thumb drives and the like, but a 100% unaltered, bog-standard keyboard brought into a building is probably easier, and for sure less suspicious if you get caught.

        Sure, you might say, “but if you have an insider you’ve already lost,” to which I say: your insider is at risk if they do certain things, but once this keyboard is installed, their detection risk is lower.

        Now the question is how far away the mic can be, because getting that installed is gonna be suspicious AF. But this is still a great way to break the air gap.

        • ItsMeSpez@lemmy.world · ↑1 · edited · 1 year ago

          “A bad actor could record the keyboard and then figure out a way to get it installed”

          The room is important to the training of the model as well, so even if you know the make and model of the keyboard, the exact acoustic environment it sits in will still require training data (see the sketch below for why the room changes the signal).

          Also, if you can install a keyboard of your choosing, you can just put the keylogger inside the keyboard. If you’re actually getting your own peripherals installed on your target machine, training a model to acoustically compromise your target is the most difficult option available to you.
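To make the point about the room concrete, here is a minimal sketch of how the same keystroke recording changes after being convolved with different room impulse responses, which is why a model trained in one acoustic environment can degrade in another. The synthetic impulse responses and placeholder clip below are illustrative assumptions, not measured data.

```python
# Minimal sketch of why the room matters: the same keystroke convolved with a
# different room impulse response (RIR) yields a noticeably different spectrum,
# so a model trained in one room sees shifted inputs in another.
import numpy as np
from scipy.signal import fftconvolve, spectrogram

SAMPLE_RATE = 44_100

def synthetic_rir(rt60_s: float, length_s: float = 0.3) -> np.ndarray:
    """Crude exponentially decaying noise stand-in for a measured room impulse response."""
    n = int(length_s * SAMPLE_RATE)
    t = np.arange(n) / SAMPLE_RATE
    decay = np.exp(-6.91 * t / rt60_s)           # ~60 dB of decay over rt60_s
    rir = np.random.default_rng(0).standard_normal(n) * decay
    return rir / np.max(np.abs(rir))

def apply_room(dry_clip: np.ndarray, rir: np.ndarray) -> np.ndarray:
    """Simulate recording the same keystroke in a given room."""
    wet = fftconvolve(dry_clip, rir)[: len(dry_clip)]
    return wet / np.max(np.abs(wet))

# Placeholder for a (num_samples,) keystroke recorded close to the keyboard.
dry = np.random.default_rng(1).standard_normal(SAMPLE_RATE // 10)

small_room = apply_room(dry, synthetic_rir(rt60_s=0.3))
large_room = apply_room(dry, synthetic_rir(rt60_s=1.0))

# The spectrograms the classifier would see differ between the two rooms.
_, _, spec_small = spectrogram(small_room, fs=SAMPLE_RATE)
_, _, spec_large = spectrogram(large_room, fs=SAMPLE_RATE)
print("mean absolute log-spectrogram difference:",
      np.mean(np.abs(np.log1p(spec_small) - np.log1p(spec_large))))
```

In practice this is why an attacker would need recordings made in, or augmented to match, the target’s actual room, which is exactly the extra training data the comment above describes.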

          • jumperalex@lemmy.world · ↑1 · 1 year ago

            Good point about the room.

            As for an installed keylogger, there are organizations that will inspect for that and catch it. My point is that this is a way to get a truly unmolested USB device into play.

            But I hear you: this probably isn’t an ideal option right now, though it could work for some niche case. And these are early days; put enough funding behind it and it might become more viable. Or not. Mostly I’m just offering the thought that there ARE use cases if someone puts even a moment’s creative thought into tradecraft and the problems it might solve, like breaking the air gap, emplacement, avoiding detection, and data exfil. Each of those is a problem to be solved at various levels of difficulty depending on the exact target.