Exploring STM32N6

A few months ago, I attended the STM32 Summit, a global event where STMicroelectronics launched their latest microcontroller series, the STM32N6. This microcontroller is geared towards AI/ML applications thanks to its integrated neural processing unit and high-performance core.

After the launch, I immediately got my hands on the STM32N6 discovery kit to dive into the world of Edge ML.

What started as an exploration of Edge ML has now expanded into a broader learning journey of

✅ Camera interfaces

✅ Displays, touchscreens, GUIs, and TouchGFX

✅ Azure RTOS (ThreadX) and advanced interfaces like USB and Ethernet

✅ Audio processing

✅ Advanced microcontroller-based hardware with external memories, displays, and cameras

I’ve already built a few simple applications using TouchGFX and run multiple ML and computer-vision demo applications.

My current target is to learn camera and display interfacing and integration. Eventually, I will work my way up to feeding audio and visual signals into various ML models to build embedded AI applications.

Guest Seminar

I was honored to be invited to BugBuster 2.0, hosted by Sardar Patel Institute of Technology, to deliver a 1.5-hour pre-event workshop on PCB design and debugging.

The event brings together 150+ students from across the country for a 24-hour hardware hackathon.
We explored the art and science of PCB design, its components, debugging techniques, and how to get started with embedded systems.

A special thanks to Prof. P.V. Kasambe and @Woodrow of the I.E.T.E. organizing team for their dedication and effort in creating such an outstanding event!

First Project using NanoEdge AI Studio

Last week, I shared my excitement about exploring Edge Machine Learning with the X-NUCLEO-IKS4A1, NUCLEO-F401RE, and NanoEdge AI Studio.

I’m excited to share the experience of building my first Edge ML application!

A MEMS sensor on the IKS4A1 senses motor vibrations, and the F401RE runs an ML model that estimates the speed range based solely on vibration data.

NanoEdge AI Studio is an intuitive tool from STMicroelectronics that makes it easy to implement ML models on microcontrollers.

Here’s what it enabled me to do:
1๏ธโƒฃ Sensor Data Collection: NEAI Studio works with the microcontroller to log the data stream into a csv file which is used for training the ML model.

2๏ธโƒฃ Model Training: Based on the data collected, NEAI Studio automatically evaluates multiple models, trains them, benchmarks them, and identifies the best fit for your data. 

3๏ธโƒฃ Model Validation and Deployment: You can test various models on your host PC before deploying it to the target microcontroller. The input data stream can be stored data or a stream of fresh data from your microcontroller. You can compare between different parameters like inference time, RAM size, flash size and accuracy.
NEAI Studio then generates a lightweight, ready-to-use model library that I could integrate directly into my application on resource-constrained microcontrollers.
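
To show what that integration step looks like in practice, here is a minimal sketch in C of calling the generated classification library from application code. The function and macro names (neai_classification_init, neai_classification, AXIS_NUMBER, DATA_INPUT_USER, CLASS_NUMBER, knowledge[]) are based on what the Studio typically generates for a classification project, and collect_vibration_frame() is a hypothetical helper, so treat this as an illustration rather than the exact generated API.

```c
/*
 * Minimal sketch of integrating the NanoEdge AI classification library.
 * The function/macro names follow what the Studio typically generates --
 * check them against your own NanoEdgeAI.h.
 */
#include <stdint.h>
#include "NanoEdgeAI.h"   /* generated by NanoEdge AI Studio */
#include "knowledge.h"    /* trained model knowledge, also generated */

/* Hypothetical helper that fills one frame of accelerometer samples. */
extern void collect_vibration_frame(float *buffer);

static float neai_buffer[AXIS_NUMBER * DATA_INPUT_USER]; /* one signal frame */
static float class_probs[CLASS_NUMBER];                  /* per-class scores */

void ml_init(void)
{
    /* Load the trained knowledge into the library once at startup. */
    if (neai_classification_init(knowledge) != NEAI_OK) {
        /* Handle initialization error here. */
    }
}

uint16_t ml_classify(void)
{
    uint16_t class_id = 0;

    /* Acquire one frame with the same axes, rate and length used when
     * logging the training data, then run a single inference. */
    collect_vibration_frame(neai_buffer);
    neai_classification(neai_buffer, class_probs, &class_id);

    return class_id; /* index of the predicted speed-range class */
}
```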

My Application

My final application, running on the NUCLEO-F401RE, has two modes:

🔹 Data Logger Mode: Captures and logs vibration data for initial model training and validation.

🔹 Inference Mode: Runs the ML model generated by NanoEdge AI Studio to predict the speed range of the DC motor from its vibrations, as outlined in the sketch below.
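
For a rough idea of how the two modes fit together, here is a sketch of the main loop. read_accel_sample() and uart_printf() are hypothetical placeholders for the board-specific sensor and serial drivers, and ml_classify() refers to the integration sketch shown earlier; my actual firmware differs in the details.

```c
/*
 * Rough sketch of the two application modes. read_accel_sample() and
 * uart_printf() are hypothetical placeholders for board-specific drivers;
 * ml_classify() is from the NanoEdge AI integration sketch above.
 */
#include <stdint.h>

extern void read_accel_sample(float xyz[3]);    /* hypothetical accel read  */
extern void uart_printf(const char *fmt, ...);  /* hypothetical UART print  */
extern uint16_t ml_classify(void);              /* see earlier sketch       */

typedef enum { MODE_DATA_LOGGER, MODE_INFERENCE } app_mode_t;

void app_loop(app_mode_t mode)
{
    float sample[3]; /* X, Y, Z acceleration of one reading */

    for (;;) {
        if (mode == MODE_DATA_LOGGER) {
            /* Stream raw vibration data as CSV lines over the serial port;
             * NanoEdge AI Studio captures this stream for training and
             * validation. */
            read_accel_sample(sample);
            uart_printf("%.4f,%.4f,%.4f\r\n", sample[0], sample[1], sample[2]);
        } else {
            /* Run the generated model on one full frame and report the
             * predicted speed range of the DC motor. */
            uint16_t class_id = ml_classify();
            uart_printf("Predicted speed range class: %u\r\n",
                        (unsigned)class_id);
        }
    }
}
```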

This project shows that it is possible to bring ML to the edge, even on hardware with limited resources. The potential applications for predictive maintenance, anomaly detection, and more are immense!

Excited to Dive into Edge Machine Learning!

I recently got my hands on the X-NUCLEO-IKS4A1, a powerful sensor evaluation board from STMicroelectronics that’s perfect for exploring Edge AI applications. 🌟

This incredible board is packed with sensors: accelerometers, gyroscopes, a magnetometer, a barometer, and more. Some of these sensors can even perform signal processing and run ML models directly!

Paired with a Nucleo board, this platform offers an excellent foundation for real-time data processing at the edge.

My goal? To deep-dive into Edge Machine Learning and harness this technology for low-power, highly intelligent systems. Over the coming weeks, I’ll be:
🔹 Learning how to integrate sensor data with AI/ML models on microcontrollers.
🔹 Exploring edge computing use cases like predictive maintenance and motion analysis.
🔹 Sharing my progress, challenges, and key takeaways.

The potential of Edge AI is massive, from IoT to industrial automation, and I’m thrilled to start this journey.