
What a Year! Recapping Edge Impulse’s 2025

By Glede Kabongo

As we close out 2025, we’re taking a moment to reflect on the year that transformed not just Edge Impulse, but the entire edge AI landscape. From groundbreaking product launches to a milestone acquisition, from explosive developer growth to innovations that literally exploded (yes, really; we’ll get to that in a moment), this year proved that edge AI has moved from emerging technology to enterprise imperative.

Here are our favorite moments, launches, and milestones from 2025 — an extraordinary year.

The Moment That Changed Everything

Let’s start with the news that reshaped our trajectory: Qualcomm Technologies’ acquisition of Edge Impulse. When we announced this news at Embedded World Nuremberg in March 2025, it wasn’t just a corporate milestone; it was validation that our vision of making edge AI accessible and deployable at scale had reached a crucial inflection point.

The acquisition brings together Qualcomm Technologies’ world-class silicon with our edge AI development platform, creating an integrated stack that enterprises need. 

What excites us most isn't just the technology synergy — it’s the opportunity to accelerate edge AI adoption across industries. Qualcomm’s global relationships with Fortune 500 companies, OEMs, and systems integrators open doors to manufacturing floors, hospitals, retail chains, smart cities, and critical infrastructure projects where edge AI can deliver transformational value.

For our 250,000+ developers (more on that shortly, too), the acquisition means access to more powerful and optimized tooling, reference architectures that reduce development time from months to weeks, and a roadmap that prioritizes the features and capabilities they need most. For enterprises evaluating edge AI investments, it means reduced risk, faster time-to-value, and the confidence that comes from a vendor backed by one of the world’s most respected technology companies.

This acquisition doesn’t change our mission; it supercharges it. We remain hardware agnostic, committed to empowering developers of all skill levels, and building tools that turn audacious ideas into production deployments. Now we have the resources, reach, and runway to do it at a global scale. For edge AI, the best is yet to come, and we’re thrilled to be a big part of that with Qualcomm Technologies.

Product Innovations That Pushed Boundaries

YOLO-Pro: Object Detection Reimagined for the Edge

One of our most anticipated launches this year was YOLO-Pro, our answer to the persistent challenge of running sophisticated object detection on resource-constrained devices. While YOLO models have long been the gold standard for computer vision, deploying them on microcontrollers and edge devices required painful trade-offs between accuracy and performance, not to mention their various licensing challenges.

Optimized specifically for edge deployment in real-life enterprise environments, YOLO-Pro delivers real-time object detection with dramatically reduced memory footprint and power consumption — without sacrificing accuracy.

Qualcomm’s Acquisition of Arduino and Launch of the UNO Q

In October 2025, Qualcomm Technologies acquired Arduino, a transaction that accelerated the company’s strategy to empower developers by opening access to its unmatched portfolio of edge technologies and products. The move combines Qualcomm’s leading-edge products and technologies with Arduino’s vast ecosystem and community, empowering businesses, students, entrepreneurs, tech professionals, educators, and enthusiasts to quickly and easily bring ideas to life. A central part of it all is empowering AI on the edge — that’s where we come in. A big welcome to our Arduino colleagues: we look forward to moving this all forward together. (Read the full announcement.)

With the corresponding launch of the UNO Q, Arduino brought a new level of compute performance, courtesy of a Qualcomm Dragonwing QRB2210 SoC, while keeping the same form factor and pin configuration as the existing UNO products.  

Edge Impulse is proud to support the UNO Q by optimizing several out-of-the-box AI models included as part of the new “App Lab” experience that ships with the board. Because it uses the Dragonwing QRB2210, the UNO Q can run standard Linux-based applications and adds software capabilities beyond MCU-based UNO variants, including the ability to run machine learning and edge AI models built with Edge Impulse! Learn more.

Empowering Makers with Arduino

Our Arduino partnership and joint webinars brought edge AI to one of the world’s largest maker communities. The Accelerating Industrial Automation with Arduino and Edge Impulse webinar, along with our DigiKey kit collaborations, put sophisticated ML capabilities into the hands of Arduino enthusiasts. This is grassroots innovation at scale, and it’s one of our favorite aspects of what we do.

Object Tracking — Following What Matters

Our object tracking launch addressed another critical gap in edge AI workflows. Detecting objects is powerful, but understanding their movement, behavior, and interactions over time is what truly unlocks intelligent applications.

With object tracking integrated directly into Edge Impulse, developers now have the flexibility to build almost any use case: monitoring worker safety by tracking people near dangerous machinery, counting objects as they cross a region (such as items moving on a conveyor belt), or following vehicles or people over time.
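To make the idea concrete, here is a minimal sketch of one common tracking approach — centroid-based ID assignment, where each new detection is matched to the nearest previously tracked object so it keeps a stable ID across frames. This is an illustrative toy, not Edge Impulse’s actual tracking implementation:

```python
# Minimal centroid tracker: assigns stable IDs to detections across frames
# by matching each new detection to the nearest tracked centroid.
# Illustrative sketch only -- not Edge Impulse's tracking implementation.
import math

class CentroidTracker:
    def __init__(self, max_distance=50.0):
        self.max_distance = max_distance  # max pixels a centroid may move per frame
        self.next_id = 0
        self.objects = {}                 # id -> (x, y)

    def update(self, detections):
        """detections: list of (x, y) centroids for the current frame."""
        assigned = {}
        unmatched = list(detections)
        # Greedily match each existing object to its nearest new detection.
        for obj_id, (ox, oy) in self.objects.items():
            if not unmatched:
                break
            best = min(unmatched, key=lambda d: math.hypot(d[0] - ox, d[1] - oy))
            if math.hypot(best[0] - ox, best[1] - oy) <= self.max_distance:
                assigned[obj_id] = best
                unmatched.remove(best)
        # Any detection left unmatched becomes a newly tracked object.
        for det in unmatched:
            assigned[self.next_id] = det
            self.next_id += 1
        self.objects = assigned
        return assigned
```

With stable IDs in hand, higher-level logic like “count each object once as it crosses a line” or “alert if person #3 stays in the danger zone” becomes straightforward.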

EON Tuner Gets Smarter with Bayesian Optimization

For those who’ve experienced the tedious trial-and-error of model optimization, our EON Tuner Bayesian optimization update was a game-changer. The EON Tuner is a powerful engineering tool that helps you find and select the best performing impulses for your application, given your hardware target resource constraints like RAM, flash, and inference time. With this latest update, it delivers better impulses in less time while using less compute! That’s the kind of acceleration that turns pilot projects into production deployments.
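To illustrate the idea behind the update, here is a toy Bayesian-optimization loop over a single hypothetical hyperparameter (number of filters): fit a surrogate model to the configurations tried so far, then pick the next one via an acquisition function instead of exhaustive search. The objective, search space, and all names are illustrative stand-ins, not the EON Tuner’s actual internals:

```python
# Toy Bayesian optimization: Gaussian-process surrogate + UCB acquisition.
# Illustrative sketch only -- not the EON Tuner's actual implementation.
import numpy as np

def rbf_kernel(a, b, length=10.0):
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_seen, y_seen, x_query, noise=1e-4):
    """Gaussian-process posterior mean and std at the query points."""
    K = rbf_kernel(x_seen, x_seen) + noise * np.eye(len(x_seen))
    Ks = rbf_kernel(x_query, x_seen)
    K_inv = np.linalg.inv(K)
    mean = Ks @ K_inv @ y_seen
    var = 1.0 - np.sum((Ks @ K_inv) * Ks, axis=1)
    return mean, np.sqrt(np.clip(var, 1e-12, None))

def objective(n_filters):
    # Stand-in for "validation accuracy after training"; in a real tuner this
    # is an expensive training run, which is why we want to call it rarely.
    return -((n_filters - 24) ** 2) / 200.0 + 0.9

candidates = np.arange(4, 65, dtype=float)   # hypothetical space: 4..64 filters
tried_x = [4.0, 64.0]                        # seed with the extremes
tried_y = [objective(x) for x in tried_x]

for _ in range(6):
    mean, std = gp_posterior(np.array(tried_x), np.array(tried_y), candidates)
    ucb = mean + 1.5 * std                   # upper-confidence-bound acquisition
    nxt = float(candidates[np.argmax(ucb)])
    if nxt in tried_x:                       # surrogate has converged
        break
    tried_x.append(nxt)
    tried_y.append(objective(nxt))

best = tried_x[int(np.argmax(tried_y))]
```

The point of the surrogate is that each loop iteration replaces many brute-force training runs with one well-chosen experiment — which is where the “better impulses in less time with less compute” gain comes from.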

Making Computer Vision Accessible

Our Computer Vision Walk-Through Wizard might sound simple, but it represents our philosophy in action: Edge AI should be approachable for everyone, from seasoned ML engineers to developers just starting their AI journey.

The wizard guides users through the entire computer vision pipeline, from data collection to model training to deployment, with clear explanations and sensible defaults at every step. It’s teaching while doing, enabling while educating. 

The Zephyr Module: Expanding Our RTOS Reach

With the Edge Impulse Zephyr module, we extended our platform support to one of the fastest-growing real-time operating systems in embedded development. Zephyr’s adoption in industrial IoT, wearables, and connected devices made this integration essential for our enterprise customers.

Now, developers working in the Zephyr ecosystem can leverage Edge Impulse’s full suite of tools — AutoML, EON Compiler, model optimization — without leaving their familiar development environment. It’s this kind of seamless integration that accelerates edge AI adoption, meeting developers where they already work rather than forcing them into new workflows.

Demos That Captured Imaginations

Visual Language Models (VLMs)

Sometimes the best way to showcase what’s possible is to build something that makes people say, “Wait, you did what?” 

At Embedded World Nuremberg this past March, we provided a glimpse into the future of AI at the edge. The aim of this demo was to showcase what can be accomplished with the powerful upcoming Dragonwing IQ9 EVK through model cascading, where multiple models are used in sequence to create a powerful overall system while staying lean and efficient when that power isn't needed.

The use case was parking lot management, showing how edge AI could be used to provide vehicle tracking and metadata collection. Read the blog for more details.
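The cascading pattern behind the demo can be sketched in a few lines: a cheap, always-on detector gates a heavyweight model, so the expensive stage only runs when there is something worth analyzing. Both model functions below are hypothetical stand-ins, not real Edge Impulse APIs:

```python
# Model-cascading sketch: cheap stage runs on every frame, expensive stage
# runs only when the cheap stage finds something. Hypothetical stand-ins only.

def cheap_detector(frame):
    """Stage 1: fast object detector (FOMO-class). Returns detected labels."""
    return frame.get("objects", [])

def heavy_vlm(frame, labels):
    """Stage 2: expensive vision-language model, invoked only when needed."""
    return f"scene with {', '.join(labels)}"

def cascade(frames):
    results, heavy_calls = [], 0
    for frame in frames:
        labels = cheap_detector(frame)   # runs on every frame, very cheap
        if labels:                       # gate: skip the VLM on empty frames
            heavy_calls += 1
            results.append(heavy_vlm(frame, labels))
    return results, heavy_calls
```

In a parking-lot scenario, most frames are empty, so the heavyweight model sits idle most of the time — that gating is what keeps the overall system lean and efficient.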

From robots that understand natural language commands in context of what they see, to assistive devices that can describe environments for visually impaired users, to industrial systems that interpret complex visual instructions, VLMs at the edge open entirely new application categories.

Racing Ahead at AWS re:Invent and Embedded World, North America

Our race car demo at Embedded World, North America, AWS re:Invent, and The Things Conference turned heads and demonstrated real-world edge AI in action. The setup detected cars using Edge Impulse’s high-speed FOMO (Faster Objects, More Objects) object detection algorithm running on a Rubik Pi, timing each lap with precision. Each slot car was equipped with an Arduino Nicla Sense ME running an on-device crash detection algorithm. When crashes occurred, results were sent back to the Rubik Pi for display, demonstrating seamless multi-device coordination, all happening in real time.

While racing the cars was exciting, the implications extend far beyond the track. The same principles apply to industrial robotics, autonomous vehicles, drone navigation, and any application where latency, reliability, and real-time responsiveness are non-negotiable.

When Edge AI Goes Boom

Perhaps our most fun moment of 2025 was the edge AI-enabled exploding Mission: Impossible prop. Yes, Jim Bruges, one of our developer relations engineers, built a device that authentically recreated the franchise’s iconic self-destructing technology — complete with countdown timer, authentication requirements, and a satisfyingly cinematic (but safe) explosion.

Beyond the entertainment value, this project demonstrated edge AI’s versatility, proving that edge AI isn’t just for factories and hospitals — it’s a creative medium that can bring imaginative concepts to life. Sometimes the best way to inspire the next generation of developers is to build something that’s simply cool. This project was edge AI meets Hollywood, and our audience loved it.

Milestones That Defined Our Growth

Five Years of Imagine: Shaping the Edge AI Ecosystem

Our Imagine conference celebrated its fifth anniversary this year, and looking back at how far we’ve come is humbling. What started as a gathering of early adopters has evolved into the premier edge AI conference, bringing together thousands of developers, engineers, and enterprise leaders to explore what’s possible at the edge.

This year’s Imagine program showcased some of the most creative and impactful edge AI projects we’ve seen: long-range security sensors that work without computers or the cloud while maintaining maximum visibility at night, dog-bark interpretation that surfaces emotional, behavioral, and health insights before symptoms appear, industrial systems transforming manufacturing, and more.

These aren’t theoretical applications — they’re deployed solutions making real-world impact today. The Imagine conference has become more than an event — it’s a community, a movement, and a reminder that edge AI is being built by people solving real problems.

250,000+ Developers and Counting

We started 2025 with 160,000 developers in our community. As we close out the year, we’re celebrating over 250,000 developers building with Edge Impulse — a growth rate that reflects edge AI’s explosive momentum.

56% developer growth in 2025.

But numbers don’t tell the full story. Behind each account is someone building something meaningful: a student learning ML for the first time, an engineer solving a production challenge, a startup bringing a product to market, an enterprise scaling across facilities. This community is the heart of everything we do, and watching it grow is one of our greatest achievements.

Imagine Innovators

This year marked the debut of Imagine Innovators, our first-ever edge AI developer conference designed to celebrate the remarkable people and expertise pushing the boundaries of edge AI to solve real-world problems. 

The event brought together hundreds of developers, engineers, and innovators from around the globe to showcase groundbreaking applications spanning healthcare, industrial automation, and more. Watching our community come together to share knowledge, inspire one another, and chart the future of edge AI reminds us why we built this community: to empower and inspire innovators to turn their vision into reality.

Global Virtual Hackathon Edge AI Contest

A big congratulations to the winners of our 2025 Global Hackathon Challenge. We took the phrase “learning by doing” to the next level with a global edge AI hackathon aimed at students and engineers alike. Running from October 30 through November 30, participants built real-world applications with Edge Impulse, showcasing their skills, creativity, and community spirit by designing ML models and deploying them directly to edge devices. Check out the winners and finalists.

Taking Over Elektor Magazine

When Elektor magazine invited us to guest-edit a special edition entirely dedicated to edge AI, we jumped at the opportunity. 

The Edge Impulse Special Edition brings edge AI tutorials, real-world case studies and use cases, technical deep-dives and more to Elektor’s global readership of electronics enthusiasts and professional developers.

Seeing our platform and projects featured in a publication with Elektor’s legacy feels like further validation that edge AI has truly arrived. We’re no longer explaining what edge AI is — we’re showing people how to build with it, and publications like Elektor are essential partners in that education mission.

Edge AI Real-World Impact

The edge AI transformation story wouldn’t be complete without discussing how it’s solving critical challenges. Here are just a few examples of edge AI making a tangible impact in the real world. 

HP – Bringing Voice Control to Earbuds & Headsets

A couple of years ago, HP set out to improve the clunky way users interact with incoming calls on traditional headsets, where responding to a call required a physical button press on a hard-to-spot part of the headset, or meant reaching for the phone itself. This could be disruptive, especially when hands are tied up with typing or writing.

The solution? A cutting-edge voice-control feature, built directly into their flagship headsets, that allows users to answer or decline phone calls from their connected devices. Using Edge Impulse, HP Poly was able to collect keyword data, train an industrial-grade ML model, and deploy it into their own custom workflow within just months.

Business impact: The Poly Voyager Free 60 earbuds, Poly Voyager Surround 80 and 85 headsets, and Poly Voyager Legend 50 headsets are now available for sale globally, with the voice-command functionality fully accessible. And the functionality works with eleven languages.

Read the full case study.

GlobalSense — Giving Vehicles Smart Hearing

Some major car component faults (like engine, transmission, and accessories) can only be detected through their sound. The problem is, diagnosing cars by sound is hard. Inspectors can’t hear the sound directly from its source, and background noise, time pressure, and differences in experience make it even harder.

This leads to costly consequences. For example, millions of dollars are lost each year in auction arbitrations, while dealerships suffer from expensive trade-in adjustments and wasted repair labor due to misdiagnosis. 

The solution: unlocking the vehicle’s voice and turning it into actionable insight using edge AI, which listens with consistency, precision, and scale.

Business impact:

Read the case study on edge AI for automotive diagnostics.

Blind Case Study: Quality Control — Car Components Manufacturer

With edge AI for real-time inspection and defect detection, manufacturers can automate the inspection process by deploying computer vision models in edge devices, enabling real-time detection of defects, anomalies, or non-conformities in products.

A European manufacturer of components for a luxury car brand is using computer vision for end-of-line quality control, leveraging Edge Impulse’s anomaly detection library, FOMO-HD. Their machine learning model detects malformed parts and incorrect part placement on the assembly line during the manufacturing process.

Business impact: Reduction in machine downtime, production waste, and machine damage/maintenance. This brought enhanced productivity and measurable cost savings of $1.6M.

Looking Ahead

As we reflect on 2025’s achievements, we’re more excited about what’s coming than what we’ve accomplished. The Qualcomm acquisition accelerates our roadmap. Our developer community is building applications we never imagined. Enterprises are moving from pilots to production deployments at unprecedented speed.

Edge AI is no longer the future — it’s the present. And if 2025 taught us anything, it’s that when you combine powerful technology with a passionate community and a mission to democratize innovation, the results exceed even the most optimistic predictions.

Here’s to our favorite things from 2025, and to even more exciting innovations in 2026. The edge AI revolution is just getting started. We can’t wait to build it with you.
