Killer AI Drone That Tracks Faces Was Built In Only Hours, And We Have No Defense
Luis Wenus, an entrepreneur and engineer, posted recently on X that he and colleague Robert Lukoszko incorporated the power of AI into a commercially available drone, turning it into a people chaser in just a few hours. The implications of such technology being so readily available are concerning.
Check out the video and see for yourself how scary this could all become.
Wenus said that he and Lukoszko coded AI capabilities into the drone’s software, giving it the ability to recognize faces. They wanted the drone to chase people around “as a game,” but it didn’t take long for the reality of what they had built to become clear.
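Wenus has not published his code, so the details of his implementation are unknown, but the underlying building blocks really are off the shelf. Purely as an illustration, and making no claim about how the pair actually did it, here is a minimal, hypothetical sketch of face detection over a video feed using OpenCV’s stock Haar-cascade model; steering a drone toward the largest detected face would be a step beyond what is shown here.

```python
# Hypothetical sketch: detect faces in a live video feed with OpenCV.
# This is NOT Wenus's code; it only shows how accessible the pieces are.
import cv2

# Load the frontal-face detector that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

capture = cv2.VideoCapture(0)  # webcam here; a drone would stream its own camera

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        # A chase loop would steer toward the largest box; here we only draw it.
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```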
It’s no big revelation that a drone could be used as a weapon, but a drone that uses facial recognition to hunt down a specific individual is a more unsettling prospect, not least because we have essentially no defenses in place against that kind of attack.
“There are no anti-drone systems for big events and public spaces yet.”
If a swarm of AI-equipped drones carrying small explosives were to descend on a crowd in a major U.S. city, there would be no immediate defense in place to counteract the attack: no net guns, GPS spoofers, or high-energy lasers standing ready to deploy.
The facial recognition on the drone the pair reprogrammed has a range of only about 33 feet (roughly 10 meters), which isn’t especially impressive, but imagine what a longer range could do: a drone that could spot a target from across a football field, or farther, and attack.
Ukraine is already using drones equipped with explosives in its fight against Russia, even if those drones may not yet rely on AI. AI and drone technology are nothing new to the U.S. military, either. In 2017, the Navy showed the world that it could control and deploy a swarm of 30 drones carrying explosives.
Wenus is a self-described “open source absolutist,” meaning he firmly believes in sharing code and software through open-source channels, but he has stated that he has no plans to release any of the code he used to program the AI drone.
He also warned that the technology could easily be used to orchestrate a terrorist attack. Writing enough code to reprogram a drone may still be a challenge today, but it will only get easier as AI advances and models become better at writing code themselves.
Wenus also identifies as “e/acc,” short for effective accelerationism, a school of thought that favors pushing AI research forward regardless of the downsides, on the conviction that the positives of advancing AI will always ultimately outweigh the negatives.
Though his beliefs about AI and drones may be on the extreme side of the spectrum, Wenus might be onto something. The breakneck pace of the technology is not something any of us can stop or control at this point. What’s that old saying, “If you can’t beat ‘em, join ‘em?”