Teen Swarmed by Police After AI Security System Mistakes Doritos Bag for Gun
A teenager in Baltimore was surrounded by police officers and briefly detained after an AI-powered security system flagged a crumpled bag of Doritos as a potential weapon. The incident, reported by NBC affiliate WBAL-TV 11, highlights growing concerns about the reliability and real-world consequences of artificial intelligence in public safety.

Taki Allen, a high school student, was sitting outside Kenwood High School with friends after football practice when a large number of police cars arrived. According to Allen, officers approached him with weapons drawn, ordered him to get on the ground and put his hands behind his back, and handcuffed him. “At first, I didn’t know where they were going until they started walking toward me with guns,” he said. “I was like, ‘What?’”

Allen said he feared for his life during the encounter. “It was mainly like, am I gonna die? Are they going to kill me?” he recalled. After searching him and finding no weapons, officers discovered the crumpled bag of chips on the ground near where he had been sitting.

The school’s principal confirmed that an alert was triggered around 7 p.m. indicating a possible weapon on campus. School safety officials quickly reviewed the alert and canceled it after determining there was no threat. The principal then contacted the school resource officer, who reached out to local police for backup. Officers responded, conducted a search, and confirmed Allen was unarmed.

While neither school officials nor police have officially confirmed that the Doritos bag was the source of the alert, they have not denied it either. The incident appears to stem from a security system provided by Omnilert, a company that markets itself as a pioneer in AI-driven active shooter prevention technology. Omnilert’s website lists an AI-powered gun detection solution designed for schools, and WBAL-TV reports that Kenwood High School began using the system last year.

Gizmodo reached out to both Kenwood High School and Baltimore County Police for comment but received no official response. Omnilert also declined to comment on the specific incident.

The event underscores the risks of relying on AI systems that lack nuance and context, especially in high-stakes environments like schools. While such technologies are promoted as tools to enhance safety, incidents like this show how easily they can misinterpret everyday objects, leading to traumatic experiences for students and raising serious questions about oversight, accountability, and the limits of automation in security.