
AI Goes on Patrol

Computer Vision, TinyML, Machine Learning

An autonomous security robot that listens for suspicious sounds, then goes on patrol to look for intruders.

Nick Bild

May 30, 2022

    It may be true that crime does not pay, but as a nation, we certainly pay a price for it.  Residential burglaries result in nearly $3.5 billion in economic damages in the United States annually. While very substantial, this number pales in comparison to the $68.9 billion worth of products stolen from retailers each year. Fortunately, there are ways to reduce the risk of falling victim to these types of crimes — research by the University of North Carolina Department of Criminal Justice & Criminology has shown that a full 50% of would-be burglars choose to move on to a different target if security cameras are present.

    Traditional security cameras can only monitor a limited area and are subject to blind spots caused by objects that block their view. Getting total coverage of a large area, like a warehouse, can require many cameras and a lot of monitoring equipment, and that cost adds up. In an effort to address this problem, I have developed an autonomous, robotic security system that I call Shield Bot. This robot uses machine learning to listen for suspicious sounds, then it goes on patrol to seek out intruders. If it finds one, Shield Bot sends an alert to its owner while emitting loud sounds and bright lights to scare away the thief.

    I designed Shield Bot using low-cost parts, such that it could be widely deployed in both residential and commercial applications. The base of the robot is an iRobot Create 3, which is essentially a robot vacuum without the cleaning components. It is a great platform for general purpose robotics projects, with a pair of motorized wheels with encoders, several infrared and bump sensors, an accelerometer, a gyroscope, and more. These sensors and actuators can be accessed via Robot Operating System 2 (ROS 2). To serve as the brains of the robot, I chose the powerful Raspberry Pi 4 single board computer. A USB webcam/microphone was added to capture images and audio, and a speaker was added to play a loud siren sound on command.

    Building the machine learning pipelines

    Since a robot’s battery can only supply power for a limited time before needing a recharge, I wanted it to stay mostly idle until it was needed. Toward that goal, I built an audio anomaly detection pipeline in Edge Impulse that is capable of detecting any sounds that are unusual for a particular environment. This type of algorithm prevents false alarms from normal sounds, like an HVAC unit turning on. Shield Bot remains on its charging dock while it continually samples audio with its microphone and runs it through this anomaly detection pipeline.
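    To give a feel for how audio anomaly detection works, here is a minimal, self-contained sketch of the core idea: learn a baseline spectral profile of the quiet environment, then flag frames that deviate far from it. This is not the actual Edge Impulse pipeline (which builds its features and anomaly scorer for you); the frame size, band count, and threshold below are hypothetical.

```python
import numpy as np

FRAME_LEN = 1024          # samples per analysis frame (hypothetical)
N_BANDS = 16              # coarse spectral bands
THRESHOLD = 3.0           # anomaly score (in std devs) that wakes the robot

def band_energies(frame: np.ndarray) -> np.ndarray:
    """Split a frame's magnitude spectrum into coarse band energies."""
    spectrum = np.abs(np.fft.rfft(frame))
    bands = np.array_split(spectrum, N_BANDS)
    return np.array([b.sum() for b in bands])

class AnomalyDetector:
    """Flags frames whose spectral profile is far from the ambient baseline."""

    def __init__(self, baseline_frames):
        feats = np.array([band_energies(f) for f in baseline_frames])
        self.mean = feats.mean(axis=0)
        self.std = feats.std(axis=0) + 1e-9   # avoid divide-by-zero

    def score(self, frame: np.ndarray) -> float:
        z = (band_energies(frame) - self.mean) / self.std
        return float(np.abs(z).mean())

    def is_suspicious(self, frame: np.ndarray) -> bool:
        return self.score(frame) > THRESHOLD
```

    In practice the baseline frames would be recorded from the actual deployment environment, so that everyday sounds like the HVAC unit are absorbed into the "normal" profile.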

    When a suspicious sound is detected, the robot backs away from its charging dock, then drives around, periodically spinning in a 360-degree circle while capturing images. These images are processed by a second machine learning pipeline that I built with Edge Impulse, this one designed to detect objects in images. I used the data acquisition and labeling tools to annotate images with the location of people in a training dataset that I built, then trained a FOMO model.
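    The patrol behavior described above can be sketched as a simple loop: undock, drive, and periodically stop to spin in place while running each captured image through the person detector. The `robot` object and `detect_person` callable below are stand-ins for the Create 3 interface and the FOMO model, and the step counts are hypothetical.

```python
SPIN_INTERVAL = 5      # drive steps between full 360-degree scans (hypothetical)
SCAN_STEPS = 8         # images captured per spin, 45 degrees apart

def patrol(robot, detect_person, max_steps=30):
    """Drive away from the dock, scanning for people until one is found
    or the step budget runs out. Returns True if an intruder was seen."""
    robot.undock()
    for step in range(max_steps):
        robot.drive_forward()
        if step % SPIN_INTERVAL == 0:
            # Stop and rotate a full circle, checking the camera as we turn.
            for _ in range(SCAN_STEPS):
                robot.rotate(360 / SCAN_STEPS)
                if detect_person(robot.capture_image()):
                    return True            # intruder found: sound the alarm
    robot.return_to_dock()                 # nothing found: go back to charging
    return False
```

    On the real robot, `drive_forward` and `rotate` would map to ROS 2 motion commands, and `detect_person` would wrap the locally deployed FOMO model.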

    Sounding the alarm

    When Shield Bot detects the presence of a person, a loud police siren sound is played through the speaker, and the iRobot’s onboard LED ring brightly flashes red and blue. Additionally, a notification is sent to the owner of the robot to alert them to the presence of an intruder. This gives the owner the opportunity to call law enforcement, but also more immediately deters the intruder from continuing their criminal activity.
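    The deterrence routine itself is simple to express. The sketch below wires the three responses together; the callables are stand-ins for the speaker, the Create 3 LED ring, and whatever notification channel (SMS, push, etc.) the owner uses, and the file name and flash count are hypothetical.

```python
def sound_alarm(play_sound, set_led_ring, notify_owner):
    """Deterrence routine: notify the owner, then run the siren and
    flash the LED ring to scare off the intruder."""
    notify_owner("Intruder detected - Shield Bot is sounding the alarm")
    play_sound("police_siren.wav")
    # Alternate red/blue flashes, mimicking police lights.
    for color in ["red", "blue"] * 5:
        set_led_ring(color)
```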

    All of the models that were built and trained with Edge Impulse were deployed locally to the Raspberry Pi. In this way, latency is very low, and there are no privacy concerns related to sending images and audio to the cloud. In the future, I would like to work on improving Shield Bot’s navigation capabilities. At present, the robot drives around, and when it encounters an object, it turns in a different direction before resuming its mission. A computer vision- or perhaps LiDAR-based approach built with Edge Impulse would allow the robot to drive in an optimized pattern to most efficiently cover a space. Got ideas? I would love help in extending the robot’s functionality in this area. Check out the project page for more details, and to build your own upgraded version of Shield Bot. Also be sure to check out Shield Bot in action in the video below.
