Hand-Drawn Gesture Recognition Interface Inspired by a Smartphone
Today, we are surrounded by a huge number of IoT devices with touch interfaces; the smartphones we use every day are the simplest example. In fact, it was the “drawing gesture recognition” feature on his smartphone that inspired Sumit to create this experiment.
Sumit wondered whether a touchpad could run on a simple microcontroller, and whether an MCU could classify complex gestures in real time, delivering inference at the edge with low power consumption and a minimal flash footprint. A little spoiler: yes, it can :)
In his tutorial, Sumit shows how to build a TinyML model that lets an embedded device recognize complex drawing gestures, such as letters and special symbols, on a TFT touchscreen display. For that, he used only three components:
- a device to collect some examples of each gesture;
- a no-code TinyML platform to train the model and embed it into a small MCU;
- an application that records the user’s strokes on the screen and uses the TinyML inference results to figure out which application each gesture should open (a minimal sketch of this pipeline follows the list).
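
To make that last step concrete, here is a minimal sketch of what the on-device side could look like. It is an illustration under stated assumptions, not Sumit’s actual implementation: it assumes an Arduino-style board with an XPT2046 resistive touch controller (via the XPT2046_Touchscreen library), and `classify_gesture()` is a hypothetical stand-in for the inference entry point a no-code TinyML platform would generate. Pin numbers, buffer sizes, and the 32-point resampling are placeholders, not values from the project.

```cpp
// Sketch of the stroke-capture + inference loop (assumptions labeled below).
#include <Arduino.h>
#include <XPT2046_Touchscreen.h>

constexpr uint8_t TOUCH_CS    = 8;    // touch chip-select pin (placeholder)
constexpr size_t  MAX_POINTS  = 256;  // raw stroke buffer size (placeholder)
constexpr size_t  N_RESAMPLED = 32;   // fixed-length model input (placeholder)

XPT2046_Touchscreen ts(TOUCH_CS);

float  raw_x[MAX_POINTS], raw_y[MAX_POINTS];
size_t n_raw = 0;

// HYPOTHETICAL model entry point: a no-code TinyML platform would export a
// generated inference call here. This stub only keeps the sketch compilable.
int classify_gesture(const float *features, size_t len) {
  (void)features; (void)len;
  return -1;  // stub: no trained model linked in this sketch
}

// Resample the variable-length stroke to N_RESAMPLED evenly spaced points
// and normalize coordinates to [0, 1], so the model sees a fixed-size input.
size_t make_features(float *out) {
  float min_x = raw_x[0], max_x = raw_x[0];
  float min_y = raw_y[0], max_y = raw_y[0];
  for (size_t i = 1; i < n_raw; i++) {
    if (raw_x[i] < min_x) min_x = raw_x[i];
    if (raw_x[i] > max_x) max_x = raw_x[i];
    if (raw_y[i] < min_y) min_y = raw_y[i];
    if (raw_y[i] > max_y) max_y = raw_y[i];
  }
  float sx = (max_x - min_x > 1.0f) ? (max_x - min_x) : 1.0f;
  float sy = (max_y - min_y > 1.0f) ? (max_y - min_y) : 1.0f;
  for (size_t i = 0; i < N_RESAMPLED; i++) {
    size_t j = i * (n_raw - 1) / (N_RESAMPLED - 1);  // nearest-index resample
    out[2 * i]     = (raw_x[j] - min_x) / sx;
    out[2 * i + 1] = (raw_y[j] - min_y) / sy;
  }
  return 2 * N_RESAMPLED;
}

void setup() {
  Serial.begin(115200);
  ts.begin();
}

void loop() {
  if (ts.touched()) {
    TS_Point p = ts.getPoint();          // raw touch coordinates
    if (n_raw < MAX_POINTS) {
      raw_x[n_raw] = p.x;
      raw_y[n_raw] = p.y;
      n_raw++;
    }
  } else if (n_raw > 8) {                // finger lifted: stroke complete
    float features[2 * N_RESAMPLED];
    size_t len = make_features(features);
    int label = classify_gesture(features, len);
    Serial.print("Predicted gesture id: ");
    Serial.println(label);               // would map to an app to open
    n_raw = 0;                           // ready for the next stroke
  } else {
    n_raw = 0;                           // stroke too short, discard as noise
  }
}
```

The design choice worth noting is the resampling step: touch strokes vary in length and speed, so collapsing each one to a fixed number of normalized points is a common way to hand a classifier a consistent input shape.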
Curious to see the outcome? Find the detailed walkthrough of the experiment in the full version of Sumit Kumar’s case on hackster.io