Some sensor processing using machine learning


This section assumes you already know about machine learning. If you don’t, there are a bunch of concepts which you may find hard to grasp, particularly within the Colab notebook. Check out the tutorials on the first page if you want an introduction to TensorFlow and machine learning. Specifically, the models I’m building here are supervised learning models, trained on ground truth data, which use convolutional neural networks to process time-series data.

On this page I will discuss a couple of examples of machine learning. To execute the machine learning models, you will use the tflite interpreter built into the websensors platform or installed on the Raspberry Pis in the lab. To train the models, we will use Google Colaboratory. This is a platform that lets you run Python scripts from a ‘notebook’ on the web. These scripts execute remotely on a virtual machine owned by Google, which has a full desktop installation of Python, with TensorFlow and many other useful libraries pre-installed. This allows you to train simple models at reasonable speeds.

Classifier example - knock-knock lock

Knock-knock lock is a lock which uses a sound sensor to detect knocking. It will open if you knock in a particular pattern. For example, you might use the ‘shave and a haircut’ pattern, which looks like this (where * is a knock and - is a pause):

*-***-*---*-*

This is an example of using a machine learning algorithm to create a simple classifier, telling us whether a knock pattern is correct or incorrect.

So, first things first, let’s create some training data. For this we need to record data from the sound sensor, along with some ground truth as to whether what we are hearing is the correct or the incorrect knock pattern.

For simplicity, I do this by recording two CSV files using the code below. Each one contains a bunch of knock patterns, separated by silence of at least 1 second. In one file, the knock patterns are all good knocks (I use shave and a haircut). In the other file, the knock patterns are all different, incorrect knocks.

I capture these using the script below.
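(The capture script itself isn’t reproduced here; the sketch below shows the general shape of it. The read_sound_level() function is a placeholder — swap in the sound-sensor call for your platform — and the sample rate, duration, and file names are assumptions.)

```python
# Sketch of a ground-truth capture script (assumed sensor API).
# Records sound-level samples to a CSV file; run it once for good
# knocks and once for bad knocks, pausing over 1 second between patterns.
import csv
import time

SAMPLE_RATE = 100            # samples per second (assumption)
DURATION = 60                # seconds of recording per file (assumption)
OUTFILE = "good_knocks.csv"  # change to "bad_knocks.csv" for the second run

def read_sound_level():
    # Placeholder: replace with the sound-sensor call for your
    # platform (websensors or the Pi's microphone input).
    raise NotImplementedError

with open(OUTFILE, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time", "level"])
    start = time.time()
    while time.time() - start < DURATION:
        writer.writerow([time.time() - start, read_sound_level()])
        time.sleep(1.0 / SAMPLE_RATE)
```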

Once you’ve captured some ground truth data, you can go to Google Colaboratory to train a model on the full copy of Python there. An example of how to do this is in the Colab notebook. Once you’ve trained a model, assuming it worked okay, you should get an output file which is a .tflite file. This will work in the cut-down version of TensorFlow which is installed on both the Raspberry Pis and on the websensor platform.
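As a rough outline of what the notebook does — the layer sizes, window length, and commented-out training call here are illustrative assumptions, not the notebook’s exact code — the training and export steps look something like this:

```python
# Illustrative training-and-export outline (not the notebook's exact code).
import tensorflow as tf

WINDOW = 200  # samples per knock-pattern window (assumed)

# A small 1D convolutional binary classifier for sound-level time series.
model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 5, activation="relu", input_shape=(WINDOW, 1)),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # 1 = correct knock
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=30)  # windows cut from the two CSVs

# Convert the trained model to a .tflite file for the cut-down interpreter.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
with open("knockmodel.tflite", "wb") as f:
    f.write(converter.convert())
```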

You can run the saved model (the .tflite file) in Python using the tflite_runtime module. To do this you need to upload the .tflite file to your Python environment. You can do this on webpython by dragging it onto the box above the start/stop buttons. On the Raspberry Pi, just copy the file back across to the Pi.
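Once the file is in place, a minimal sketch of loading and running it with tflite_runtime looks like this (how you gather the window of samples is up to you, and the 0.5 threshold is an assumption):

```python
# Sketch: running the saved .tflite classifier with tflite_runtime.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="knockmodel.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def classify(window):
    # window: array of sound levels, reshaped to match the model input.
    data = np.asarray(window, dtype=np.float32).reshape(input_details[0]["shape"])
    interpreter.set_tensor(input_details[0]["index"], data)
    interpreter.invoke()
    return interpreter.get_tensor(output_details[0]["index"])[0][0]

# e.g. unlock if classify(latest_window) > 0.5
```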

Regression model example - tempo tapper

The tempo tapper calculates a tempo in beats per minute that you are tapping.

This uses an almost identical model to the classifier, but instead of outputting a choice of class, it outputs a value in beats per minute. This is an example of using machine learning to do regression, that is, to estimate the value of some quantity.

We generate ground truth for this differently - instead of tapping and telling the system what taps we’re doing, the ground truth generator script tells us when to tap to create a range of tempos.
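As a sketch of the idea — the actual generator script may differ, and the session count and tempo range here are made up — the prompt loop might look like this:

```python
# Sketch of a tempo ground-truth generator (illustrative, not the
# actual script): prompts taps at a known BPM while recording runs.
import random
import time

for _ in range(20):                  # 20 short tapping sessions (assumption)
    bpm = random.randint(60, 180)    # pick a tempo in a plausible range
    print(f"Tap along at {bpm} BPM")
    beat = 60.0 / bpm
    for _ in range(16):              # prompt 16 beats at this tempo
        print("TAP")
        time.sleep(beat)
    # log bpm alongside the recorded sensor data as the ground truth label
```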

We can similarly load this ground truth data in another Colab notebook, and train an almost identical neural network to do the regression.
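For illustration, the only changes from the classifier sketch above are the final layer and the loss function: a single linear output trained with mean squared error, rather than a sigmoid trained with cross-entropy.

```python
# The regression variant differs only in its head and loss (illustrative).
import tensorflow as tf

WINDOW = 200  # assumed window length, as before
model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 5, activation="relu", input_shape=(WINDOW, 1)),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(32, 5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1),  # linear output: estimated BPM
])
model.compile(optimizer="adam", loss="mse")  # regression loss, not cross-entropy
```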

Load your ground truth into the Colab notebook and run it as before. It takes quite a while to train the model. You should end up with tempomodel.tflite, which goes into the script below to run it.
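The run script isn’t reproduced here either; a sliding-window loop along these lines would do the job, reusing the placeholder sensor call from the capture sketch above:

```python
# Sketch: continuously estimate tempo with tempomodel.tflite.
from collections import deque

import numpy as np
from tflite_runtime.interpreter import Interpreter

WINDOW = 200  # must match the window length used in training

def read_sound_level():
    raise NotImplementedError  # replace with your platform's sensor call

interpreter = Interpreter(model_path="tempomodel.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

samples = deque(maxlen=WINDOW)
while True:
    samples.append(read_sound_level())
    if len(samples) == WINDOW:
        data = np.asarray(samples, dtype=np.float32).reshape(inp["shape"])
        interpreter.set_tensor(inp["index"], data)
        interpreter.invoke()
        print("Tempo estimate:",
              interpreter.get_tensor(out["index"])[0][0], "BPM")
```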