Body camera video equivalent to 25 million copies of “Barbie” is collected but rarely reviewed. Some cities are looking to new technology to examine this stockpile of footage to identify problematic officers and patterns of behavior.
ITT: People who are scared of things they don't understand, which in this case is AI.

In this case, the "AI" program is nothing more than pattern-recognition software setting a timestamp where it believes there's something worth looking at (roughly like the sketch below). Then an officer can take a look.

It saves so much time, and it filters out anything irrelevant. But be careful, because it's labelled "AI". Scary.

EDIT: The replies to this comment confirm that you don't understand AI, because if you did, you'd know that this system that scans video is not an LLM (large language model). It's not even the same kind of system at its core.
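A minimal sketch of the kind of pipeline being described, assuming segment-level scoring: a model assigns each fixed-length chunk of footage a confidence score, and only chunks above a cutoff become timestamps for human review. Everything here is illustrative, not any vendor's actual API; `score_segment` is a toy stand-in (audio loudness), and `SEGMENT_SECONDS` and `FLAG_THRESHOLD` are made-up parameters.

```python
from dataclasses import dataclass

SEGMENT_SECONDS = 10.0   # assumed chunk length
FLAG_THRESHOLD = 0.8     # assumed confidence cutoff

@dataclass
class Flag:
    start_s: float       # where the reviewing officer should start watching
    score: float         # model confidence that something notable occurred

def score_segment(audio_rms: float) -> float:
    # Toy stand-in for the real pattern-recognition model: treat loud
    # audio (shouting, sirens) as "worth a look". Clamp to [0, 1].
    return min(audio_rms / 0.5, 1.0)

def flag_footage(segment_rms_levels: list[float]) -> list[Flag]:
    # Emit a timestamp only when the score clears the threshold;
    # everything else is filtered out before a human ever sees it.
    return [
        Flag(start_s=i * SEGMENT_SECONDS, score=s)
        for i, rms in enumerate(segment_rms_levels)
        if (s := score_segment(rms)) >= FLAG_THRESHOLD
    ]

# Example: one loud segment at t=20s gets flagged; quiet ones are skipped.
print(flag_footage([0.05, 0.10, 0.45, 0.08]))
```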
It's also potentially skipping some of the parts that should be looked at; whether it does depends on the training set (see the sketch below).

It's not that AI is scary, it's that AI is dumb as fuck.
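A hedged illustration of that objection, reusing the toy `flag_footage` from the earlier sketch: measure recall against hand labels. The quiet incident at index 3 is missed because the toy scorer only "knows" loudness, the analogue of an event type absent from the training set. The levels and labels are made up.

```python
levels  = [0.05, 0.60, 0.10, 0.07]    # per-segment audio RMS
labeled = [False, True, False, True]  # human ground truth: "should be reviewed"

flagged = {f.start_s for f in flag_footage(levels)}
hits = sum(1 for i, y in enumerate(labeled)
           if y and i * SEGMENT_SECONDS in flagged)
recall = hits / sum(labeled)
print(f"recall = {recall:.0%}")       # 50%: the quiet incident was skipped
```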