Body camera video equivalent to 25 million copies of “Barbie”
Literally anything but the metric system
it’s called SEO and it’s an art
SEO is the bane of the internet. SEO is why I have to scroll through a novella every time I want to check out a recipe
Oh that’s so fucking annoying, but I also think it’s part of the culture among those who typically submit recipes online.
it may have started that way, but now it’s all SEO
We don’t even need to choose! Just use hours, months, years, decades! But no, Barbie movies.
There is no common metric measure of time.
Edit -common
I’m sorry, can you restate that in terms of the number of hyperfine ground-state transitions of a Cs-133 atom?
The second is one of the 7 fundamental units
https://en.m.wikipedia.org/wiki/International_System_of_Units
Yes it is, but SI is not all metric. Metric is fundamentally a base-10 system, and time is base 60. You can probably thank the ancient Sumerians for that, though there’s some debate.
At one point the French tried to make metric time a thing but it didn’t stick.
Short times are always given in powers of 10 of seconds (ms, μs, ns). And long ones can be too.
And machinists in America use decimal inches, but I don’t think anyone would say that inches are metric.
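The powers-of-10 point is easy to show concretely; a minimal sketch (the example duration is made up):

```python
# SI prefixes scale the second by powers of 10 (example duration is made up)
seconds = 1.5e-6               # 1.5 microseconds

milliseconds = seconds * 1e3   # 1 s = 10^3 ms
microseconds = seconds * 1e6   # 1 s = 10^6 us
nanoseconds = seconds * 1e9    # 1 s = 10^9 ns
```

No base-60 arithmetic anywhere, unlike converting the same duration into minutes or hours.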
Spelling
Body camera video equivalent of 25 million copies of “Barbie”
Is this a typical unit of measurement in journalism? Like what even is this? Crappy in-article advertising? Some weird SEO shit? An odd attempt to be cool and hip?
It’s America; anything but metric.
And which “metric” unit of time measurement do you prefer?
Well, metric time, obviously:
I prefer seconds since 00:00:00 on Jan 1st, 1970
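“Seconds since 00:00:00 on Jan 1st, 1970” is Unix epoch time; a minimal sketch of how it’s computed (the example date is arbitrary):

```python
from datetime import datetime, timezone

# Unix epoch time: seconds elapsed since 1970-01-01 00:00:00 UTC
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
moment = datetime(2024, 1, 1, tzinfo=timezone.utc)

seconds_since_epoch = int((moment - epoch).total_seconds())
print(seconds_since_epoch)  # 1704067200
```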
Ah yes, the metric measurement of time. My favorite.
Your point? It’s still not really a valid comment. Just a braindead joke that’s played out even when it’s actually relevant lol
Yes.
It’s the kind of phrasing you use when you’re paid for how long an article is, but not how good it is.
That sounds like a big investment to find no wrongdoing by officers.
oh great I’m sure the training for this will not result in a bunch of things getting “reviewed” and no one being responsible for mistakes at all…
Sounds like humans, so I guess it’s AI progress? :p
Would you rather these things never be reviewed? Isn’t something better than nothing?
You’ll literally never be able to afford (or hire) enough people to review the data they are taking in…
I mean unless we start killing billionaires and taking their shit.
Yea I share the same concerns about the “AI”, but this sounds like a good thing. It’s going through footage that wasn’t going to be looked at (because there wasn’t a complaint / investigation), and it’s flagging things that should be reviewed. It’s a positive step
What we should look into for this program is
- how the flags are being set, and what kind of interaction will warrant a flag
- what changes are made to training as a result of this data
- how the privacy is being handled, and where the data is going (ex. Don’t use this footage to train some model, especially because not every interaction is out in the public)
Make it publicly accessible. It’ll most certainly get watched and problems will be reported to be investigated further.
Corporations would be delighted to analyze all this footage.
File a complaint, and you get to view the video. If nobody files a complaint, there is no need to view the video.
Indeed, nobody should be looking at the video unless a complaint is filed.
WE should be able to review it/see it ALL.
We pay these fucks to torture and kill with our tax $.
They should have nothing to hide from us.
Well, I mean, you could rig the cameras to turn on when the cop gets out of their car, so the footage is broken into specific encounters where the cop had to interact with someone. Identify the files by the date, time, and badge number of the cop the camera is assigned to, and now you’ve got an easy-to-search database of footage whenever an incident is reported, either by the cop (because they had to issue paperwork for it) or by whoever they were interacting with (because they want to lodge a complaint).
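That naming scheme (date, time, badge number) could be sketched like this; the format and field names are purely illustrative, not any department’s actual convention:

```python
from datetime import datetime, timezone

def footage_filename(badge_number: str, start: datetime) -> str:
    # Date and time first so filenames sort chronologically;
    # badge number makes footage searchable per officer.
    return f"{start:%Y%m%d_%H%M%S}_badge{badge_number}.mp4"

name = footage_filename("4521", datetime(2024, 3, 5, 14, 30, 0, tzinfo=timezone.utc))
print(name)  # 20240305_143000_badge4521.mp4
```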
While randomly selecting files not involved in an ongoing investigation as potential training material could be helpful, we don’t actually HAVE to have a dedicated review resource scanning for bad behaviour or material relevant to investigations, since in both cases someone is incentivized to start the process that pulls the relevant footage anyway.
What if all the cam footage was just uploaded to something like YouTube? Publicly visible by, ya know, the very citizens who pay for it and whom they work for…
Wouldn’t that be a huge privacy issue?
The police are already a huge rights issue when they’re acting without oversight
That feels like it would be a logistics nightmare, and just a nightmare in general. Does every single individual have an account where they’re forced to stream their footage? If not, and it’s all uploaded to a single channel for a department, who’s in charge of uploading the footage? Who’d even be willing to spend their days doing nothing but uploading footage when the department’s internal internet connection slows to a crawl because of the person(s) who have to upload it (because you just know they certainly ain’t paying for a private network for this in most areas)?
In theory it sounds great but in practice it just sounds like a nightmare. Not defending the police but it just doesn’t seem like a task they’d be willing to take up because of all the work they’d have to put in to make sure it works.
That, and the money they spend doing something like this could obviously be used on something more pressing, like shooting a black man because he didn’t get down on the ground and worship the boots of the officer that killed him after being pulled over on suspicion of absolutely nothing (/s on this part)
If they ain’t up to the task, then they could just quit. I don’t see the problem.
Ah, good. I had “racist profiling AI/LLM” on my 2024 bingo card.
Yes, because AI has a firm grasp on nuanced topics like law enforcement and civilian/human rights…
You may as well play the video to an empty room.
ITT: People who are scared of things they don’t understand, which in this case is AI.
In this case, the “AI” program is nothing more than pattern recognition software setting a timestamp where it believes there’s something to be looked at. Then an officer can take a look.
It saves so much time, and it filters out anything irrelevant. But be careful, because it’s labelled “AI”. Scary.
EDIT: The replies to this comment confirm that you don’t understand AI, because if you did, you’d know that this system that scans video is not an LLM (large language model). It’s not even the same kind of system at its core.
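For what it’s worth, “pattern recognition that sets a timestamp” could look roughly like this; the per-frame scores are a stand-in for whatever detection model such a system actually uses:

```python
def flag_timestamps(frame_scores, threshold=0.8, fps=30.0):
    """Return timestamps (in seconds) of frames scoring above threshold.

    frame_scores is a stand-in: a per-frame "worth a look" score that
    would come from some detection model, not part of any real system.
    """
    return [i / fps for i, s in enumerate(frame_scores) if s > threshold]

# Hypothetical scores for five frames; with fps=1.0, timestamps equal frame indices
print(flag_timestamps([0.1, 0.2, 0.95, 0.4, 0.9], fps=1.0))  # [2.0, 4.0]
```

An officer would then jump straight to the flagged timestamps instead of watching hours of footage.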
It’s also potentially skipping some of the parts that should be looked at. It depends on the training set.
It’s not that AI is scary, it’s that AI is dumb as fuck.
And I’m sure the criminal acts by police will get filtered out.
I wonder if it’s one of those AIs that can’t detect darker skin tones…
“our Pig AI System searched all of the videos. No cop did anything wrong. Ever. The End” ~cop fucks
(Fuck this shit. As usual. Abolish police)
So funny thing, Seattle Police Department did a pilot for AI that did sentiment analysis on police audio and looked for things like racial slurs. They pretty quickly disbanded the project and destroyed the evidence.
(IIRC some folks requested info from the pilot and they claimed to have deleted it.)