Artificial intelligence, or AI, is being used to help secure sites from sports arenas to churches and schools. The technology scans for weapons, including guns, knives and explosives, as people walk between standing panels. If a weapon is spotted, nearby security personnel are alerted.
Massachusetts-based Evolv has used the technology to screen roughly 300 million people across the country since the system went live in 2019, second only to the TSA.
"Think about walking directly right into a venue, into a school, into a building without breaking stride," said Peter George, the CEO OF Evolv, touting the technology as far less obtrusive than traditional metal detectors. "And if you don't have a weapon on you, you get to walk right in. And if you do, we can identify that."
Evolv's technology is being used at major sports stadiums, urban hospitals, schools, courts and big casinos, among other venues.
"It's a free flow touchless weapons screening system," explains Steve Morandi, Evolv's Vice President of Product Management. "It really works with a combination of A.I., advanced sensors and cameras in a really integrated way. And we're basically detecting weapons versus everyday metal objects that we all carry."
Bay State-based Liberty Defense has combined AI with 3D imaging capable of hunting for non-metallic threats, like powders, pipe bombs or ghost guns made from plastic.
"We're looking for any type of anomaly, any type of threat that might be concealed," explains Bill Frain, CEO of Liberty Defense. "So, whether that's a gun or a knife or plastic explosive that could do damage or maybe even drugs or liquids."
The new HEXWAVE system will be tested this summer at a Hindu temple near Atlanta, the University of Wisconsin and Toronto Pearson International Airport.
The proliferation of AI technology in security has alarmed critics.
"What we don't want to see is America turned into a check point society where we're searched every time we go to any public gathering to a church or other, a place of worship or a little league game or what have you," says Jay Stanley, a senior policy analyst with the ACLU Speech, Privacy, and Technology Project.
Regarding privacy concerns, Frain notes, "We don't save any of the data. No images are stored."
George says, "We use our artificial intelligence to discriminate between a phone and the firearm, but we're actually not looking at the people at all. We're only looking for weapons."