Facial-recognition data is typically used to prompt more vending machine sales.

  • cmnybo@discuss.tchncs.de · ↑168 · 7 months ago

    Why the hell does a vending machine need a facial recognition camera to “activate the purchasing interface”?

    There should just be a set of buttons to select what you want and a window so you can see what items are available.

    • Aatube@kbin.social · ↑98 · 7 months ago

      Stanley sounded alarm after consulting Invenda sales brochures that promised “the machines are capable of sending estimated ages and genders” of every person who used the machines without ever requesting consent.

      • NatakuNox@lemmy.world · ↑19 · 7 months ago · edited

        Yup, it’s for “advertising.” Say, for example, the Army wants to know which areas have the most fighting-age men, so posters and recruiters know where to hang out. (This is the most extreme example.)

    • LesserAbe@lemmy.world · ↑15 · 7 months ago

      I saw some posts about a similar technology in the meetings and events industry: a company is selling “facial analysis,” not “facial recognition.” They try to get around privacy laws by saying “well, our technology does scan every single face it sees, but it doesn’t store that image; it just determines age, gender, race and emotional sentiment and adjusts tallies for those categories in a database.” (A minimal sketch of that tally-only approach is at the end of this comment.)

      It’s still information gathering I didn’t consent to while attending a conference, and it’s a camera with the potential to be hacked.

      Of course it’s always about marketing and advertising. They want to have a heat map of which areas are popular and at what times. In the case of events, it’s so they can sell to sponsors and exhibitors. At this university, it’s less clear. Do the vending machines have a space to sell ads? That would be my guess.
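
      To make that “tally, don’t store” claim concrete, here is a minimal sketch of what such a pipeline might look like. Everything in it is hypothetical (analyze() stands in for whatever model the vendor actually uses); the point is that only aggregate counters survive each frame, never the image.

      ```python
      from collections import Counter

      # Hypothetical stand-in for a vendor's "facial analysis" model.
      # It returns coarse category guesses for a single camera frame.
      def analyze(frame):
          return {"age_band": "18-25", "gender": "female", "sentiment": "neutral"}

      tallies = Counter()

      def process_frame(frame):
          """Bump aggregate tallies, then let the frame be discarded."""
          for key, value in analyze(frame).items():
              tallies[(key, value)] += 1
          # No image and no per-person record is kept -- only the counters.

      process_frame(b"raw-camera-bytes")  # stand-in for one captured frame
      print(tallies)
      ```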

    • magnetosphere@kbin.social · ↑13 · 7 months ago

      Because people are dumb. If the machine knows when someone is looking at it, it can stop doing whatever it does to try and get your attention, and put itself in “sales mode”.

      Still, you’re right. It seems like an overly complicated and expensive solution. Old-fashioned vending machines did the job just fine.

  • _number8_@lemmy.world · ↑146 ↓3 · 7 months ago

    why do people think it’s okay to do this shit? if you’re coding facial recognition for a vending machine, that’s like 80 steps too far down the capitalism ladder

    if you took this machine back to the 1920s and told people what it was doing, they’d shoot at it. and probably you

    • random9@lemmy.world · ↑23 ↓3 · 7 months ago · edited

      80 steps too far down the capitalism ladder

      This is the result of capitalism: corporations (aka the rich, selfish assholes running them) will always attempt to do horrible things to earn more money so long as they can get away with it, perhaps only paying relatively small fines. The people who did this face no jail time and no real consequences; this is what unregulated capitalism brings. Corporations should not have rights or shield the people who run them; the people who run them need to face prison and personal consequences. (edited for spelling and a missing word)

    • Kornblumenratte@feddit.de · ↑15 ↓1 · 7 months ago

      The article gives a sound explanation: the machine is activated when it detects a human face looking at the display.

      If this face recognition software only decides “face” or “not face” and does not store any data, I’m pretty sure this setup will be compatible with any data protection law. (A rough sketch of such a face-only gate is at the end of this comment.)

      OTOH, they claim that these machines provide statistics about the age and gender of customers, so they are obviously recognising more than just “face yes”. Still, if the data stored is just statistics on age and gender and no personalised data, I’m pretty sure it still complies even with 1920s data protection habits.

      I’m pretty sure this would be GDPR-compliant, too, as long as the customer is informed, e.g. by including this info in the terms of service.
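
      For what it’s worth, a pure “face yes / face no” gate really can be built without retaining anything. Here is a rough sketch using OpenCV’s bundled Haar cascade; that library choice is my assumption for illustration, not Invenda’s actual stack. The frame is used once to decide whether to wake the screen and then simply goes out of scope.

      ```python
      import cv2  # OpenCV, assumed here purely for illustration

      detector = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
      )

      def should_wake_screen(frame) -> bool:
          """Return True if at least one face is in view; store nothing."""
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
          return len(faces) > 0

      # Typical loop: grab a frame, decide, forget.
      # cap = cv2.VideoCapture(0)
      # ok, frame = cap.read()
      # if ok and should_wake_screen(frame):
      #     activate_purchasing_interface()  # hypothetical UI hook
      ```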

      • yuriy@lemmy.world · ↑20 · 7 months ago

        If I need to accept a TOS to use a vending machine, I don’t need to use that vending machine.

        • slumberlust@lemmy.world · ↑5 · 7 months ago

          Fear not, you agree to a car’s ToS just by getting in as a passenger! Not sure how enforceable that is, but the fact that they try is gross enough.

        • Kornblumenratte@feddit.de · ↑2 · 7 months ago

          I don’t know about the US, but in Germany, by using a vending machine you are implicitly and automatically consenting to the vendor’s ToS through that action.

  • ChaoticNeutralCzech@feddit.de · ↑98 ↓2 · 7 months ago

    The students should get together, jack the machine, haul it off to their hacking club and do some reverse engineering, so that we get more information on how the data collection actually worked instead of just trusting the company’s statements. If a hacking group like the German Chaos Computer Club got behind this, they could release their findings while keeping the perpetrators anonymous. However, I’m pretty sure the machine is just a frontend to a server, which got shut down as soon as the students complained, with no GDPR-like oversight available in that jurisdiction.

      • KairuByte@lemmy.dbzer0.com · ↑8 · 7 months ago

        Not only was a person behind the decision, a person was also behind the dissemination of the requirements, the implementation of the change, the design of the hardware, and all the steps in between.

    • Hamartiogonic@sopuli.xyz · ↑14 · 7 months ago

      When you start tinkering with a machine learning model of any kind, you’re probably going to find some interesting edge cases the model can’t handle correctly. Maybe there’s a specific face that has an unexpected effect on the device. What if you could find a way to cheese a discount out of it or something?

      • redcalcium@lemmy.institute · ↑17 · 7 months ago

        Imagine a racist vending machine. The face recognition system thinks this customer is black with 81% confidence. Let’s increase the price of grape soda! Oh look, a 32-year-old white woman (79% confidence). Better raise the price of Diet Coke!

        • SpaceCowboy@lemmy.ca · ↑3 ↓1 · 7 months ago

          In Japan they had some kind of facial recognition on vending machines selling cigarettes that would determine the age of the person in an attempt to prevent kids from buying cigarettes. But it only worked for Japanese people.

          Stupid racist vending machine wouldn’t sell me smokes!

            • SpaceCowboy@lemmy.ca · ↑2 ↓1 · 7 months ago

              It’s cool, I quit years ago.

              Also, I was in a diverse group of people and we were able to do some science. Fortunately we had a Japanese person in the group, which allowed me to purchase the smokes. But yeah, it failed on everyone who wasn’t Japanese.

        • Hamartiogonic@sopuli.xyz · ↑1 · 7 months ago

          When you use a generated face with a mixture of white and black features, that’s when it gets interesting. Maybe you can even cause an integer overflow.

      • ChaoticNeutralCzech@feddit.de · ↑5 · 7 months ago · edited

        I don’t think they’re doing dynamic pricing on an individual basis; that would be too obvious. But checking the demographics of each location or individuals’ shopping habits, and potentially adjusting the prices or offerings? Definitely.

        • Hamartiogonic@sopuli.xyz · ↑1 · 7 months ago

          So, if you show it 100 faces from group A and 4 faces from group B, that could start gradually shifting the prices in a specific direction. If you keep going, you might be able to make it do something funny like charging 0.1 € for a Pepsi and 1000 € for a Coke or something like that. If the devs saw that coming, they might have set some limits so that the price can’t spiral totally out of control (a toy sketch of such a clamp follows below).
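
          To make the “limits” idea concrete, a hypothetical pricing rule might nudge the price a little per observed face but clamp it to a band. All the numbers and names below are invented, not anything the vendor has described.

          ```python
          BASE_PRICE = 2.00                    # starting price in euros
          MIN_PRICE, MAX_PRICE = 1.50, 2.50    # clamp band the devs might enforce

          price = BASE_PRICE

          def observe(shift):
              """Apply one small demographic-driven nudge, then clamp the price."""
              global price
              price = min(MAX_PRICE, max(MIN_PRICE, price + shift))

          # 100 faces from group A (+1 cent each), then 4 from group B (-1 cent each):
          for _ in range(100):
              observe(+0.01)
          for _ in range(4):
              observe(-0.01)
          print(f"{price:.2f}")  # 2.46: the clamp capped the upward drift at 2.50
          ```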

          • ChaoticNeutralCzech@feddit.de · ↑2 · 7 months ago

            I am sure the profit margin is taken into account, so you won’t get an ultracheap Pepsi unless it expires soon. Similarly, I expect it to consider economic viability, so it won’t keep raising prices unless people are willing to pay them. Of course, you never know what the model actually does or what goals it follows (maximizing profit is a good guess, though), or how bad the coding is. The program might be very versatile and robust, or it may break when you show it a QR code - how can I know? Probably something in between.

  • Tristaniopsis@aussie.zone · ↑63 ↓3 · 7 months ago

    “Where Cadillac Fairview was ultimately forced to delete the entire database, “

    LOL yeah right.

    “OK BUBBA! WE DONE DEE-LEETED THE ENTIRE THANG!!”

    Bollocks.

    They probably gave the ‘enforcement’ agency a blank hard drive and said “Well, gee, shucks. That’s all we had!”

  • theodewere@kbin.social · ↑60 · 7 months ago

    “over 5 million nonconsenting Canadians” were scanned into Cadillac Fairview’s database

    fully scanned facially by automated kiosks in malls… the database was deleted only after an investigation…

  • db2@lemmy.world · ↑28 ↓3 · 7 months ago

    To the people that allowed that gross invasion to happen:

    Oopsie woopsie, diddums make a widdle fucky wucky? Yes you did. Yes you did.

    Then do what you’d do to any other child: take away the toy they misbehaved with.

  • Kissaki@feddit.de · ↑27 ↓7 · 7 months ago

    “facial recognition exe” doesn’t say anything about a “face image database” as this post title claims.

    • teamevil@lemmy.world · ↑16 ↓3 · 7 months ago

      What the hell else could they be doing with the data? Scanning a face without a database is absolutely pointless.

      • Kissaki@feddit.de · ↑17 ↓8 · 7 months ago · edited

        The linked article tells you: Recognize when someone stands in front of the vending machine.

        “the data” is interpreted. Not stored or matched.

        • uis@lemm.ee · ↑17 ↓1 · 7 months ago

          “Why do you need fingerprint reader?”

          “To recognize when someone touches the vending machine.”

        • CaptPretentious@lemmy.world · ↑16 · 7 months ago · edited

          Sure, that’s their claim, but they’re not asking “why have that type of tech anyway?”

          If it’s supposed to just act as a motion sensor, we’ve had those for decades, and none of them needed to register whether it was a face or not. Why isn’t the purchasing interface just always there? Why is it an interface at all, and why is it not just a button that says “press to start”?

          Why is there a computer in there that’s been trained to recognize what a face is in order to open up a purchasing interface? What would be the point of investing that much research and development if it was just doing something that could have been accomplished in the ’90s with tech you could have bought at Radio Shack?

          • brianorca@lemmy.world · ↑8 ↓1 · 7 months ago

            Article says “the machines are capable of sending estimated ages and genders” so it’s not recognizing individuals, but perhaps adjusting the sales pitch for who it sees walking by.

            (But it’s a college campus, so most students will be around the same age. Maybe it pitches different things to teachers?)

          • AngryishHumanoid@reddthat.com · ↑5 ↓1 · 7 months ago

            Speaking as someone who works heavily in data analysis and application databases, I can tell you it would be very, very easy to see whether it was just a front-end application using the data or whether it was storing it in a database. There are use cases for both setups, absolutely, but a cursory examination of the machine in question would make it abundantly clear which one it was doing.

  • devilish666@lemmy.world · ↑17 · 7 months ago

    Hmm… facial recognition vending machine huh…
    Finally it’s time for my jammer & some script from c/netsec to shine

  • pHr34kY@lemmy.world · ↑15 ↓4 · 7 months ago · edited

    I doubt it’s collecting or transmitting much. It’s probably just estimating age, sex, race, etc. and using that to decide which promotion to put on screen, something like the sketch at the end of this comment. It’s possibly collecting these estimates to determine what type of people use the machine, similar to those billboards in shopping centres.

    Storing each individual to recognize later or identify online seems like a stretch.

    If it did have a user bio database, it would be centralised and not on the machine itself.
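
    If it really is just promo targeting, the logic could be as mundane as a lookup keyed on the estimated demographic. This is only a guess at the general shape, not Invenda’s actual code; every name and category here is made up.

    ```python
    # Hypothetical promo table keyed on (age band, gender estimate).
    PROMOS = {
        ("18-25", "male"): "energy drink bundle",
        ("18-25", "female"): "iced coffee promo",
        ("60+", None): "sugar-free range",
    }

    def pick_promo(age_band, gender):
        """Choose an on-screen promotion from coarse demographic estimates."""
        return (PROMOS.get((age_band, gender))
                or PROMOS.get((age_band, None))
                or "default carousel")

    print(pick_promo("18-25", "male"))    # energy drink bundle
    print(pick_promo("60+", "female"))    # sugar-free range
    print(pick_promo("30-45", "other"))   # default carousel
    ```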

    • doctorcrimson@lemmy.world · ↑8 ↓1 · 7 months ago · edited

      I think the problem is that it is storing user faces at all. If it were simply identifying each person’s characteristics, there would be no reason to save that data for later. Also, the company apparently advertises that the machine does transmit estimated age and gender data for every purchase.

      • dev_null@lemmy.ml · ↑2 · 7 months ago

        That’s your claim though. They are storing “male, 24” and that’s it, no face. Of course they could be lying and actually are storing faces, but it doesn’t look like it. And it’s also perfectly valid to object to them storing even “male, 24”.

  • AutoTL;DR@lemmings.world (bot) · ↑10 ↓1 · 7 months ago

    This is the best summary I could come up with:


    The Reddit post sparked an investigation from a fourth-year student named River Stanley, who was writing for a university publication called MathNEWS.

    Where Cadillac Fairview was ultimately forced to delete the entire database, Stanley wrote that consequences for collecting similarly sensitive facial recognition data without consent for Invenda clients like Mars remain unclear.

    Stanley’s report ended with a call for students to demand that the university “bar facial recognition vending machines from campus.”

    Some students claimed on Reddit that they attempted to cover the vending machine cameras while waiting for the school to respond, using gum or Post-it notes.

    “The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface—never taking or storing images of customers.”

    It was only after closing a $7 million funding round, including deals with Mars and other major clients like Coca-Cola, that Invenda could push for expansive global growth that seemingly vastly expands its smart vending machines’ data collection and surveillance opportunities.


    The original article contains 806 words, the summary contains 166 words. Saved 79%. I’m a bot and I’m open source!

  • Snapz@lemmy.world · ↑6 ↓1 · 7 months ago

    Those “any combination” Coca-Cola machines have cameras on them.

  • ___@lemm.ee · ↑9 ↓23 · 7 months ago · edited

    I’ll play devil’s advocate. The machine recorded estimated age and gender. Assuming it tracked statistics and didn’t store images, what is the real harm? Future candy will have different designs after they found most users were 70-year-old grandpas?

    It is anonymized PII data collected without explicit consent, sure, but don’t blow it out of proportion. There is no big surveillance state plot here (yet), just an overzealous marketing team.

    • discount_door_garlic@lemmy.world · ↑27 ↓1 · 7 months ago

      Not everybody who approaches the machine or walks past it is really consenting to their appearance being logged and analysed though - not to mention that “we don’t store data” is only true if the security is effective and no exploits manage to weaponise the camera now staring back at you as you try to make a purchase.

      Ultimately, vending machines are completely passive sales anyway; collecting demographic data about who is buying from the machine is a little useless, because it’s not like the machine can work on its closing techniques for coin-based candy sales.

    • tabular@lemmy.world · ↑26 · 7 months ago · edited

      If you don’t have access to the source code, then you don’t know what it’s doing. If there are economic incentives to take my picture and tie my face to my name, then I’m going to assume “trust us, it’s anonymous” means “we buy and sell your data” (at least).

      If you’ll grant that there are people in power who would want a surveillance state, and that businesses routinely sell data to governments, then you don’t get to dismiss this out of hand. We have to draw the line somewhere, even if marketing people with a stalker mentality don’t see the line.