• stoy@lemmy.zip · 2 years ago

    I am calling bullshit on all of their points.

    1. No screen, but a projector to project on your hand? WTF? So not only will it show far less information, but it will be a pain to use…

    2. Voice commands? Meaning I will need to tell everyone around me what I am doing? Also calling bullshit on getting them to work in a busy area.

    3. No it can’t. There is no way to detect nutrition from a picture of a piece of food.

    4. Privacy? Yeah, get back to me in 20 years when it has been proven not to sell, leak, or have data stolen; then I’ll be impressed.

    In conclusion, this is as real as the Skarp laser razor is.

    • morrowind@lemmy.ml · 2 years ago

      No it can’t. There is no way to detect nutrition from a picture of a piece of food.

      Why not? At least to the extent that a human can: an AI model recognizes the type of food, estimates the amount, and calculates nutrition based on that (hopefully verified against actual data, unlike in this demo).

      All three of these functions already exist; all that remains is to put them together.
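
      Roughly, the pipeline would look like the sketch below. Everything in it is made up for illustration: the helper names, the per-100 g table, and the two stubs standing in for real vision models.

      ```python
      # Hypothetical three-stage pipeline: recognize the food, estimate the
      # portion, then look up nutrition in a reference table.

      NUTRITION_PER_100G = {  # per-100 g values, e.g. from a USDA-style table
          "mashed_potatoes": {"kcal": 88, "protein_g": 1.9, "fat_g": 0.6},
          "white_bread": {"kcal": 265, "protein_g": 9.0, "fat_g": 3.2},
      }

      def classify(image) -> str:
          """Stand-in for an image-classification model."""
          raise NotImplementedError

      def estimate_grams(image, label: str) -> float:
          """Stand-in for a portion-size estimator (needs a scale reference)."""
          raise NotImplementedError

      def estimate_nutrition(image) -> dict:
          label = classify(image)
          grams = estimate_grams(image, label)
          per_100g = NUTRITION_PER_100G[label]
          return {k: round(v * grams / 100, 1) for k, v in per_100g.items()}
      ```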

      • stoy@lemmy.zip · 2 years ago

        Ok, take any book and keep it closed: how many times do the letters s, q, d and r appear in the book?

        There is no way to know without opening the book and counting. Sure, you could do some statistical analysis based on the language used, but that doesn’t take into account the font size, the spacing, or the number of pages.
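
        To make that concrete: any such estimate is built entirely out of guesses, like the back-of-envelope sketch below (the page count, characters per page, and letter frequencies are all assumptions, which is exactly the problem):

        ```python
        # Estimating letter counts in a closed book from typical English
        # letter frequencies. Every input is a guess; the output inherits
        # all of that uncertainty.

        PAGES = 300            # guessed
        CHARS_PER_PAGE = 1800  # guessed; depends on font size and spacing
        LETTER_FREQ = {"s": 0.063, "q": 0.001, "d": 0.043, "r": 0.060}

        total_chars = PAGES * CHARS_PER_PAGE
        for letter, freq in LETTER_FREQ.items():
            print(f"{letter}: ~{int(total_chars * freq):,}")
        ```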

        Since the machine only has a photo to analyze, it can only give extremely generic results, making them effectively useless.

        You would need to open the food up and actually analyze a part of the inside with something like a mass spectrometer to get any useful data.

        • KairuByte@lemmy.dbzer0.com · 2 years ago

          I agree with you, but disagree with your reasoning.

          If you take 1lb of potatoes, boil and mash them with no other add-ins, you can reasonably estimate the nutritional information through visual inspection alone, assuming you have enough reference to see there is about a pound of potatoes. There are many nutrition apps out there that utilize this, and it’s essentially just lopping off the extremes and averaging out the rest.
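
          (Presumably something like the sketch below; the trim fraction and the sample calorie estimates are invented:)

          ```python
          # Trimmed mean: sort the estimates, drop the extremes at both
          # ends, and average what's left.

          def trimmed_mean(values, trim=0.15):
              vals = sorted(values)
              k = int(len(vals) * trim)  # how many to drop from each end
              kept = vals[k:len(vals) - k] if k else vals
              return sum(kept) / len(kept)

          # e.g. calorie estimates for "1 lb mashed potatoes" across recipes;
          # the butter-heavy 700 outlier gets trimmed away:
          estimates = [350, 380, 400, 420, 450, 520, 700]
          print(round(trimmed_mean(estimates)))  # ~434
          ```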

          The problem with this is, it’s impossible to accurately guess the recipe, and therefore the ingredients. Take the aforementioned mashed potatoes. You can’t accurately tell what variety of potato was used. Was water added back during the mashing? Butter? Cream cheese? Cheddar? Sour cream? There’s no way to tell visually, assuming uniform mashing, what is in the potatoes.

          Not to mention, the pin sees two pieces of bread on top of each other… what is in the bread? Who the fuck knows!

          • knotthatone@lemmy.one · 2 years ago

            It isn’t as magical (or accurate) as it looks. It’s just an extension of how various health tracking apps track food intake. There’s usually just one standard entry in the database for mashed potatoes based on whatever their data source thinks a reasonable default value should be. It doesn’t know if what you’re eating is mostly butter and cheese.

            How useful a vague and not particularly accurate nutrition profile really can be is an open question, but it seems to be a popular feature for smartwatches.

          • stoy@lemmy.zip · 2 years ago

            I see what you mean, and while you raise a few excellent points, you seem to forget that a human looking at mashed potatoes has far more data than a computer looking at an image.

            A human gets data about smell, temperature, texture and weight in addition to a simple visual impression.

            This is why I picked the book/letter example: I wanted to reduce the variables available to a human, to get closer to what a computer has from a photo.

              • stoy@lemmy.zip · 2 years ago

                But what use would it be then? You wouldn’t be able to compare one potato to another; both would register the same values.

                • adeoxymus@lemmy.world · 2 years ago

                  I think the use case is not people doing potato studies but people who want to lose weight and need to know the amount of calories in the piece of cake that’s offered at the office cafeteria.

            • webghost0101@sopuli.xyz · 2 years ago

              You are correct, but you are speaking for yourself and not, for example, for disabled people who may lack some of those senses or the capacity to calculate a result. As AI improves its capabilities, they are the first to benefit.

              • stoy@lemmy.zip · 2 years ago

                I get what you are saying, but this specific device had no future.

    • makingStuffForFun@lemmy.ml · 2 years ago

      I feel like that about being photographed: being in a room full of smartphones that associate me with them in a large database somewhere, having someone’s kids use TikTok on my home network, etc. We are massively under corporate surveillance, and I despise it.

    • Otter@lemmy.ca · 2 years ago

      I’m going to actively avoid people doing so, and I feel like others will as well.

      If someone walks up to you and they’re filming you on their phone, how would most people react?

        • Eggyhead@kbin.social · 2 years ago

          I’m just imagining people walking around with smart Iggy Pop badges and having AI conversations with them.

      • ramenshaman@lemmy.world · edited · 2 years ago

        I… I’ve never really watched Star Trek. I think I could get into it but I’ve just never really sat down and watched it. I’ve probably only seen an episode or two in my whole life.

        I’ll see myself out.

  • NeoNachtwaechter@lemmy.world · 2 years ago

    Ai Pin has a “prominent Trust Light” which turns on when the device is in use.

    Let’s all place bets: What will be the first mod or patch for this device? 🥷🏽

  • jordanlund@lemmy.world · 2 years ago

    It’s a neat idea, but frankly, I don’t want or need other people hearing my business. It needs to pair with (smart?) earbuds.

    • phorq@lemmy.ml · 2 years ago

      Yeah, this whole product is an exercise in doing things not because it’s more practical than what we already have but simply because we can… for $700 + $24/month… No thanks…

    • dmtalon · 2 years ago

      From watching a random YT video about it from the creators that popped up in my feed: it supports earbuds.

  • maegul (he/they)@lemmy.ml · 2 years ago

    For some reason this is making me wish Jobs were still around.

    I’d hope he’d have some subtle burns about this product … maybe about how we’re visual animals and you can’t just throw out decades of progress on screen tech and call that innovation. Maybe something about how we’ve got one voice but 10 fingers and two eyes.

  • kalkulat@lemmy.world · 2 years ago

    “A built-in 13-megapixel ultra wide-angle camera can be used to capture photographs and videos.” I bet other people won’t like the camera any more than they did Google Glass.

    Photos can be viewed using the “Center” website on any web browser. Want to see photos? Gotta go to the website. All photos therefore shared. Along with notes, music listened to, reminders … Nuh-uh!

  • MudMan@kbin.social · 2 years ago

    I mean, it’s fun that a techbro thought “isn’t the TNG combadge cool?” and actually went and made it, but this was a YouTube video, not a product launch.