Who Goes There? Intruder Detection Using Ultra-Low-Power Thermal Vision

If you want to keep an eye on your home while you are away, sure, you could buy an off-the-shelf security camera, but where is the fun in that? And anyway, you might want to customize your security camera so that it does not alert you every time your cat decides to jump up on the kitchen counter, or do all of the other forbidden things that you would never allow if you were home. As an Edge Impulse user, chances are that your first instinct would be to build your own security camera, which is just what hardware hacker Naveen Kumar has done.

The device consists of a Raspberry Pi Pico to provide the computational horsepower, and a Pimoroni MLX90640 thermal camera breakout to capture a low-resolution map of the temperatures in its field of view. A Seeed Studio Wio Terminal displays the output from the thermal camera and also helps with collecting the training dataset for a machine learning model. Kumar used this setup to collect thermal images falling into three categories: person, object (miscellaneous non-person items), and background. Frames were captured at a resolution of 24x32 pixels and saved to local storage on the Wio Terminal.
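Kumar's own firmware is not reproduced here, but a capture loop along the lines of the following sketch shows the general idea. It assumes the Adafruit_MLX90640 Arduino library (the Pimoroni breakout is a standard MLX90640 over I2C, so other drivers look similar), and for simplicity it streams each frame over serial as CSV rather than saving to the Wio Terminal's local storage as Kumar did.

// Minimal thermal capture loop (a sketch, not Kumar's actual firmware),
// assuming the Adafruit_MLX90640 Arduino library.
#include <Wire.h>
#include <Adafruit_MLX90640.h>

Adafruit_MLX90640 mlx;
float frame[24 * 32];  // one temperature (°C) per pixel

void setup() {
  Serial.begin(115200);
  if (!mlx.begin(MLX90640_I2CADDR_DEFAULT, &Wire)) {
    Serial.println("MLX90640 not found, check wiring");
    while (true) { delay(10); }
  }
  mlx.setMode(MLX90640_CHESS);
  mlx.setResolution(MLX90640_ADC_18BIT);
  mlx.setRefreshRate(MLX90640_2_HZ);
}

void loop() {
  if (mlx.getFrame(frame) != 0) {
    return;  // read failed, try again on the next pass
  }
  // Emit the 768 temperatures as one CSV line, ready to be labeled as
  // person, object, or background.
  for (int i = 0; i < 24 * 32; i++) {
    Serial.print(frame[i], 2);
    Serial.print(i < 24 * 32 - 1 ? "," : "\n");
  }
  delay(500);
}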

Thermal image dataset

The collected dataset was then uploaded to Edge Impulse using the command line interface (CLI). Parameters to the CLI command also split the images into separate training and testing datasets. Once the upload completed, the data was available to inspect in Edge Impulse Studio. Kumar designed an impulse with a neural network classifier learning block, and then after tweaking a few parameters, he trained the model with a button click.

When the training process finished, metrics were presented to help assess how well the model performed against the test dataset. The network classified heat signatures correctly in 80% of cases. That is not an especially high accuracy, but it is quite respectable given how few examples of each class were provided for training. With a larger, more diverse training set, accuracy would be expected to rise considerably.

Deploying the model to the Pico

With the concept proven in Edge Impulse, it was time to deploy the model locally to the Raspberry Pi Pico. This was done by using the EON Compiler, which can dramatically reduce the memory required by a model without sacrificing accuracy, to create a C++ library that runs natively on the Pico. The library made it simple for Kumar to run inferences with the neural network from his own code. After building his application against this library and dragging and dropping the resulting firmware onto the Pico, everything was in place to run the classifier offline. He also added an LED to the device for demonstration purposes, so that a light comes on whenever a person is detected.
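The glue code that calls the library is Kumar's own, but invoking an exported Edge Impulse C++ library generally looks something like the sketch below. It assumes the impulse takes the raw 24x32 = 768 temperature values as input, that the LED is wired to GPIO 15, and that a 0.6 confidence threshold is reasonable; all three are illustrative assumptions rather than details from the project.

// Sketch of calling the exported Edge Impulse C++ library on the Pico.
// LED pin and confidence threshold are assumptions made for illustration.
#include <string.h>
#include "pico/stdlib.h"
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

static const uint LED_PIN = 15;                             // hypothetical wiring
static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];  // latest 24x32 thermal frame

void classify_frame() {
  // Wrap the raw buffer in the signal_t structure the SDK expects.
  signal_t signal;
  numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

  ei_impulse_result_t result = { 0 };
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
    return;  // inference failed, skip this frame
  }

  // Light the LED whenever the "person" class clears a confidence threshold.
  for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
    if (strcmp(result.classification[ix].label, "person") == 0) {
      gpio_put(LED_PIN, result.classification[ix].value > 0.6f);  // threshold is a guess
    }
  }
}

int main() {
  stdio_init_all();
  gpio_init(LED_PIN);
  gpio_set_dir(LED_PIN, GPIO_OUT);
  while (true) {
    // ...fill `features` with a fresh MLX90640 frame here...
    classify_frame();
    sleep_ms(500);
  }
}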

Not counting the display and LED, which are only needed for demonstration purposes, the device draws less than 100 mA of current. Kumar speculates that this could be reduced substantially by only running inferences when the detected temperatures are high enough that a human might plausibly be present in the image (an approach sketched below). He also notes that, for a more practical device, the LED could be replaced with a low-power LoRaWAN transmitter that sends alerts. A device of this sort could run on battery power for several days without a recharge.
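As a rough sketch of that temperature-gating idea (not code from the project), the firmware could scan each frame for its hottest pixel and skip inference entirely when nothing in view is warm enough to be a person. The 28 °C threshold below is an illustrative guess.

// Only wake the classifier when the frame contains a plausibly warm object.
#include <stddef.h>

bool frame_worth_classifying(const float *frame, size_t n_pixels) {
  float max_temp = frame[0];
  for (size_t i = 1; i < n_pixels; i++) {
    if (frame[i] > max_temp) {
      max_temp = frame[i];
    }
  }
  return max_temp > 28.0f;  // no warm object, so skip the relatively costly inference
}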

Aside from home security, Kumar also sees uses for similar devices in industrial monitoring and rescue operations. It is also a great entry-level project for anyone who wants to learn more about tinyML and Edge Impulse, with an interesting device to show off at the end. Be sure to check out Kumar’s write-up and public project page for all the details.


Want to see Edge Impulse in action? Schedule a demo today.
