AI gun detector
A new startup using computer vision software to turn security cameras into gun-detecting smart cameras has raised $2.2 million in venture capital funding in a round led by Bling Capital, with participation from Upside Partnership and Tensility Venture Partners.
Aegis AI sells its technology to U.S. corporations and school districts; it scans thousands of video feeds for brandished weapons and provides threat-detection alerts to customers within one second, for $30 per camera, per month. Coupling AI and cloud computing, Aegis integrates with existing camera hardware and video management software, requiring no on-site installation or maintenance.
“We can take over the role of a security guard with much higher accuracy at a much lower cost,” Aegis co-founder and chief product officer Ben Ziomek tells TechCrunch.
I’m skeptical.
July 2nd, 2019 at 3:54 pm
What does the software think of a paper cutout of a pistol taped to the back of a fellow student? Asking for a friend.
How well does it detect the dangerous poptart handgun?
Finger guns? Bent sticks on the playground? Zero tolerance advocates want to know!
And having replaced the armed School Resource Officer, how does the software stop a gun brandisher from shooting up the school?
July 2nd, 2019 at 4:13 pm
Probably works OK in most situations, but the failure modes for AI vision are surreal:
https://www.youtube.com/watch?v=XaQu7kkQBPc
No reason you couldn’t reverse the process and make a hydrodip that fools vision systems into thinking your firearm is a turtle.
July 2nd, 2019 at 5:27 pm
Perfectly doable with machine learning to pattern-match, then running it by someone to verify visually. But there will be a lot of false positives, which will push response costs onto local police departments.
Way easier and cheaper to have 3-5 teachers and admins armed and trained.
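The false-positive worry above can be made concrete with Bayes' rule: when real brandishing events are vanishingly rare across thousands of feeds, even a very accurate detector produces mostly false alarms. A rough sketch with made-up numbers (the sensitivity, false-alarm rate, and prevalence below are assumptions for illustration, not anything Aegis has published):

```python
# Base-rate illustration: all three figures are hypothetical.
sensitivity = 0.99        # P(alert | gun actually brandished)
false_alarm_rate = 0.001  # P(alert | nothing happening), per monitored interval
prevalence = 1e-6         # fraction of monitored intervals with a real gun

# Bayes' rule: what fraction of alerts correspond to a real gun?
p_alert = sensitivity * prevalence + false_alarm_rate * (1 - prevalence)
precision = (sensitivity * prevalence) / p_alert
print(f"Alerts that are real: {precision:.3%}")
```

Under these assumptions, roughly one alert in a thousand reflects a real event; everything else is a cost that lands on whoever has to verify and respond.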
July 3rd, 2019 at 5:37 am
So they have to brandish the gun first?
Isn’t that usually the last step before they start shooting?
Will it detect holstered guns? Gun silhouettes printing on clothing? How does it detect a gun hidden in a bookbag, briefcase, or even a puffy coat?
$30/month times 9 months times 50-100 cameras is some serious money, but still less than hiring Barney Fife or Scot Peterson, and sadly just as ineffective.
July 3rd, 2019 at 7:09 am
Add open-carry jokes here:
July 3rd, 2019 at 8:31 am
Machine learning works pretty well for some applications. I have no background in this other than a single college class, but I do work with some of the best data scientists in the world in my (not video surveillance) industry. Here’s what they tell me: algorithms and heuristics used by the ML models are well understood, and nobody has any “secret sauce” for that component. The hard part is getting a really large dataset of true positives (in this case: yes, that’s bad guy with a gun) and true negatives (in this case: nothing bad is happening).
Video, however, comes with a unique challenge: the “attributes” the ML model can be trained against are almost infinitely variable. Not just the values of the attributes (we see that in my industry), but the attributes themselves.
Where does that leave us?
In my opinion, this company almost certainly does not have enough true-positive data or enough compute to make an ML model that works.
July 3rd, 2019 at 4:52 pm
Hacking to commence immediately. I assume this is as secure as all else in the IoT.
July 3rd, 2019 at 6:58 pm
“…how does the software stop a gun brandisher from shooting up the school?”
I guess it’ll have to be linked to an automatic, wall-mounted firearm which would instantly kill anyone carrying a Pop-Tart gun or an electric drill, glue gun, caulking gun, et al.
July 3rd, 2019 at 6:59 pm
And the poor cops entering to defend the kids from a shooter would be in trouble as well. Or maybe the cops just cower outside and wait for the shooting to stop?
July 3rd, 2019 at 8:43 pm
I wonder how well this software would react to an outline of the state of Florida?
July 4th, 2019 at 2:34 pm
And it can’t STOP the shooter… Replacing SROs with this is stupid.
July 4th, 2019 at 3:38 pm
Let’s see what this startup’s insurance policy covers and does not cover.
That should tell it all.
July 5th, 2019 at 8:49 am
A follow-up.
The London Metropolitan Police are large, sophisticated, and well-funded. If reporting is to be believed (https://www.theregister.co.uk/2019/07/04/met_police_slammed_for_facial_recognition_practice/), they cannot make facial recognition technology work. In many respects facial recognition is a MUCH easier problem than using ML to detect a gun-related threat from video surveillance.