
TensorFlow Lite is an open-source library that enables running machine learning models and doing inference on end devices, such as mobile or embedded devices. We cannot train a model using TensorFlow Lite. Before running a model, we must convert a TensorFlow model to a TensorFlow Lite model using the TensorFlow Lite converter.

This tutorial shows how to install precompiled TensorFlow Lite 2.12 on Raspberry Pi. We have created a release on the GitHub repository and uploaded a Debian package (.deb) that contains precompiled TensorFlow Lite 2.12.0 binaries for Raspberry Pi 3 Model A+/B+ and Raspberry Pi 4 Model B. Binaries are compatible with Raspberry Pi OS Bullseye (32-bit and 64-bit). Testing was performed on a Raspberry Pi 4 Model B (8 GB).

Execute the following commands to download the .deb package from the releases page of the repository:

wget
wget

When the download is finished, install TensorFlow Lite:

sudo apt install -y ./tensorflow-lite_64.deb

Remove the .deb package because it is no longer needed:

rm -rf tensorflow-lite_64.deb

Testing TensorFlow Lite (C API)

The Debian package contains shared libraries for the C and C++ APIs. Before starting, install the GNU C compiler:

sudo apt install -y gcc

For testing, we need a TensorFlow Lite model. This model solves the simple linear regression problem described in the post. You can read the post on how to convert a TensorFlow 2 model to a TensorFlow Lite model, or you can download a prepared model from the Internet:

wget -O model.tflite

In a C program, load the model, create an interpreter, allocate the tensors, and get the input tensor:

TfLiteModel *model = TfLiteModelCreateFromFile("model.tflite");
TfLiteInterpreterOptions *options = TfLiteInterpreterOptionsCreate();
TfLiteInterpreter *interpreter = TfLiteInterpreterCreate(model, options);
TfLiteInterpreterAllocateTensors(interpreter);
TfLiteTensor *inputTensor = TfLiteInterpreterGetInputTensor(interpreter, 0);
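The interpreter-creation calls shown above can be grown into a complete test program. The sketch below assumes the model takes a single float32 input and produces a single float32 output (as a simple linear-regression model would); the input value 10.0 is an arbitrary example, not taken from the post:

```c
/* test.c - minimal TensorFlow Lite C API inference sketch.
 * Assumes a model with one float32 input and one float32 output. */
#include <stdio.h>
#include <tensorflow/lite/c/c_api.h>

int main(void) {
    /* Load the converted model from disk. */
    TfLiteModel *model = TfLiteModelCreateFromFile("model.tflite");
    if (model == NULL) {
        fprintf(stderr, "Failed to load model\n");
        return 1;
    }

    /* Create an interpreter and allocate tensor buffers. */
    TfLiteInterpreterOptions *options = TfLiteInterpreterOptionsCreate();
    TfLiteInterpreter *interpreter = TfLiteInterpreterCreate(model, options);
    TfLiteInterpreterAllocateTensors(interpreter);

    /* Copy a sample input value into the input tensor. */
    TfLiteTensor *inputTensor = TfLiteInterpreterGetInputTensor(interpreter, 0);
    float x = 10.0f;
    TfLiteTensorCopyFromBuffer(inputTensor, &x, sizeof(x));

    /* Run inference and read back the prediction. */
    TfLiteInterpreterInvoke(interpreter);
    const TfLiteTensor *outputTensor =
        TfLiteInterpreterGetOutputTensor(interpreter, 0);
    float y = 0.0f;
    TfLiteTensorCopyToBuffer(outputTensor, &y, sizeof(y));
    printf("Prediction: %f\n", y);

    /* Free resources. */
    TfLiteInterpreterDelete(interpreter);
    TfLiteInterpreterOptionsDelete(options);
    TfLiteModelDelete(model);
    return 0;
}
```

Assuming the package installs the C library as libtensorflowlite_c (the exact name may differ), the program can be compiled and run with: gcc test.c -o test -ltensorflowlite_c and then ./test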

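For context, the "simple linear regression" that such a test model solves can be written out in plain C. The closed-form least-squares fit below is illustrative; the sample data (points on the line y = 2x - 1) is a hypothetical training set, not taken from the post's model:

```c
/* Closed-form least-squares fit for y ~ w*x + b.
 * Writes the slope into *w and the intercept into *b. */
void linreg_fit(const double *x, const double *y, int n,
                double *w, double *b) {
    double sx = 0, sy = 0, sxy = 0, sxx = 0;
    for (int i = 0; i < n; i++) {
        sx  += x[i];
        sy  += y[i];
        sxy += x[i] * y[i];
        sxx += x[i] * x[i];
    }
    /* Standard normal-equation solution for one feature. */
    *w = (n * sxy - sx * sy) / (n * sxx - sx * sx); /* slope */
    *b = (sy - *w * sx) / n;                        /* intercept */
}
```

A model trained on such data should predict approximately w*x + b for new inputs, which is what the C API test above verifies on-device.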