By Masoud Koleini
RedisAI is a Redis module for running deep learning graphs over data stored in Redis. Because Redis has a low memory footprint, RedisAI is a good candidate for edge inference as interest grows in running machine learning applications on small devices.
In this talk, we evaluate the performance and memory usage of machine learning inference using the TensorFlow and TensorFlow Lite backends on a Jetson Nano device (with an Arm Cortex-A57 CPU and a 128-core Maxwell GPU). We show an implementation of an image classification system over a stream of data, and orchestrate it with the k3s Kubernetes distribution on edge devices.
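The core RedisAI workflow behind such a classification setup can be sketched with redis-cli. This is an illustrative fragment, not the talk's actual code: the model file, key names, tensor shape, and output node name are hypothetical, chosen for a MobileNet-style image classifier, and the commands assume a Redis server with the RedisAI module loaded.

```shell
# Hypothetical model file and key names, for illustration only.
# Load a TensorFlow frozen graph onto the GPU (-x passes the model
# blob from stdin as the last argument):
redis-cli -x AI.MODELSET mobilenet TF GPU \
    INPUTS input OUTPUTS MobilenetV2/Predictions/Reshape_1 < mobilenet_v2.pb

# Store an input image as a 1x224x224x3 float tensor
# (in practice the raw image bytes are sent as a binary blob):
redis-cli AI.TENSORSET image FLOAT 1 224 224 3 BLOB "<raw image bytes>"

# Run inference on the stored tensor and read back the class scores:
redis-cli AI.MODELRUN mobilenet INPUTS image OUTPUTS classes
redis-cli AI.TENSORGET classes VALUES
```

Because model and tensors live in Redis keys, a consumer reading from a Redis stream can trigger AI.MODELRUN on each incoming image without moving data out of the database, which is what keeps the memory footprint small on a device like the Jetson Nano.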