A.I. company Worldcoin has rolled out 1,500 Orbs to more than 35 cities in a bid to create digital identities for the world’s citizens.

  • NaibofTabr
    19 • 11 months ago

    Hmm, based on the pictures in the article this thing is basically a camera in a shiny ball about 1 ft in diameter (it appears to be about the width of 3 bricks laid side by side). It’s not a Cloud Gate-sized object. To get a scan of your irises you would have to be pretty close to it for at least a few seconds - it’s not like it could get a scan if you’re just walking by a few feet away. You’d have to walk up and point your face at it on purpose. The camera in it also looks fixed; I doubt it can rotate to follow you, since that would be mechanically complex, expensive, and prone to failure.

    Based on the description, their software takes an image of your irises and reduces it to a hash value. The original image is deleted (they claim) and the hash value is stored as an ID code. It seems likely that the hash value will be unique to their software - e.g. if you wrote your own code to produce hash values from images, you would get a different number even if you had the same picture of the same eyes. So the hash value doesn’t necessarily represent anything about your eyes that would be much of a privacy invasion… It’s just a mathematically derived number string which is unique to their software.
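    A toy sketch of that point (plain SHA-256 over made-up bytes, not Worldcoin’s actual pipeline): even a trivial difference in how two programs preprocess the same image turns it into unrelated digests.

```python
import hashlib

# Stand-in for raw iris image bytes (not real biometric data)
image = bytes(range(256)) * 4

# Pipeline A: hash the raw bytes directly
id_a = hashlib.sha256(image).hexdigest()

# Pipeline B: "preprocess" first (here, keep every other byte)
id_b = hashlib.sha256(image[::2]).hexdigest()

print(id_a == id_b)  # → False: same "eye", different software, different ID
```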

    It’s not clear what part of this system is “AI”, though my guess would be it has something to do with re-identifying your eyes next time you want to access whatever is secured with your hash code. It’s really not clear how that would work… a new image of your eyes collected a year later under different lighting conditions would probably produce a different hash value, so how does this system match them, if it only records the hashes?

    FWIW, I think smashing or spray painting these things, while fun ideas in the rebellious teenager sense, is probably overkill and likely to get you more attention from law enforcement than you want. But, you could probably just walk up behind it and slap a sticker or tape over the camera… they’d still have to pay someone to go out and peel it off.

    • jon
      7 • 11 months ago

      Taking a picture instantly after would probably create a different hash value. The thing about hashing is that even if one bit is different between source images, the resulting hashes would look entirely different.
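      That avalanche property is easy to demonstrate with an ordinary cryptographic hash (a sketch; whatever hash the Orb actually uses is not documented):

```python
import hashlib

data = bytes(1024)          # stand-in for raw image bytes
tweaked = bytearray(data)
tweaked[0] ^= 0x01          # flip a single bit

h1 = hashlib.sha256(data).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

# Count hex digits that differ between the two 64-digit digests
diff = sum(a != b for a, b in zip(h1, h2))
print(diff)  # one flipped input bit changes roughly 60 of the 64 digits
```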

      I suppose I could conceive of a proprietary hash algorithm that allows fuzzy matching of iris photos, but as you said, eyes photographed years apart under different conditions wouldn’t match the original hash, or could falsely match similar-looking eyes. It’s not like this system lets them capture high-resolution, perfectly lit iris photos, after all.

      The whole thing sounds dubious, and I suspect AI is mentioned solely to secure investor funding, much like how several years back everything mentioned Blockchain.

      • @UFODivebomb@programming.dev
        1 • 11 months ago

        They are likely using a form of https://en.wikipedia.org/wiki/Perceptual_hashing

        The noise level a perceptual hash is sensitive to can be tuned.
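        A minimal difference-hash (dHash) sketch illustrates both properties: tolerance to small noise, with a bit-distance threshold you can tune. This is a generic textbook technique, not whatever the Orb actually ships:

```python
def dhash(pixels, w=9, h=8):
    """pixels: row-major grayscale values (w*h of them) -> 64-bit int."""
    bits = 0
    for row in range(h):
        for col in range(w - 1):
            left = pixels[row * w + col]
            right = pixels[row * w + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; the match threshold tunes noise tolerance."""
    return bin(a ^ b).count("1")

# A smooth gradient "image", the same image slightly brightened, and its negative
base = [(x * 3 + y) % 256 for y in range(8) for x in range(9)]
brighter = [min(255, p + 2) for p in base]
inverted = [255 - p for p in base]

print(hamming(dhash(base), dhash(brighter)))  # → 0 (brightness shift ignored)
print(hamming(dhash(base), dhash(inverted)))  # → 64 (every bit differs)
```

        Because the hash only encodes which neighbor is brighter, a uniform lighting change leaves every bit untouched, while a genuinely different image lands far away in Hamming distance.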

        The “falsely match similar looking” problem is harder than one would expect. I used to work on an audio fingerprinting system that was extremely robust against matching “similar” audio. What sounded similar to us was always identified uniquely by the hash, with high confidence.

        For example: take the same piano piece, played by the same artist on the same piano, performed as closely as they could to the original. With ~10 seconds of audio, the perceptual hash was never confused. Not once. We could even identify how much of a pre-recorded song was used in a “live” performance.

        There are adversarial attacks for perceptual hashes. However, “similar eyes” would not be one to a standard perceptual hash. More like: a picture of an abstract puppy happens to have the same hash as an eye.

        I’d be curious about the details of the hash; that’s necessary to know what the adversarial attacks are. But I see no mention of the details, which is suspicious on its own.

    • @albx79@lemmy.world
      2 • 11 months ago

      According to their website, the image is normalised in such a way that taking multiple pictures of the same iris, even under vastly different lighting conditions, will always generate the same hash.
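      As a toy illustration of how normalisation can make a hash lighting-invariant (an assumed, drastically simplified pipeline, not their actual one): binarising each pixel against the image’s own mean cancels a global brightness shift before the bytes ever reach the hash function.

```python
import hashlib

def normalized_hash(pixels):
    """Binarize against the image's own mean, then hash the bit pattern."""
    mean = sum(pixels) / len(pixels)
    bits = bytes(1 if p > mean else 0 for p in pixels)
    return hashlib.sha256(bits).hexdigest()

img = [10, 200, 30, 180, 90, 150, 60, 120]    # toy "iris" pixel values
brighter = [min(255, p + 40) for p in img]    # same scene under more light

print(normalized_hash(img) == normalized_hash(brighter))  # → True
```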

      The data collection may well be nefarious, but this is hardly spying on people. It’s roughly equivalent to having desks with fingerprint scanners. You won’t be scanned unwittingly and against your will. It’s not even remotely possible, from a technical point of view.