tflite

Summary

For documentation, see third_party/tensorflow/lite/core/model_builder.h.

For documentation, see third_party/tensorflow/lite/core/interpreter_builder.h.

For documentation, see tensorflow/lite/core/interpreter.h.

Memory management for TF Lite.

This provides a few C++ helpers that are useful for manipulating C structures in C++.

Main abstraction controlling the tflite interpreter. Do NOT include this file directly; instead include third_party/tensorflow/lite/interpreter.h. See third_party/tensorflow/lite/c/common.h for the API for defining operations (TfLiteRegistration).

Typedefs

FlatBufferModel
    using ::tflite::impl::FlatBufferModel

Interpreter
    typedef ::tflite::impl::Interpreter
    An interpreter for a graph of nodes that input and output from tensors.

InterpreterBuilder
    using ::tflite::impl::InterpreterBuilder

Functions

DefaultErrorReporter()
    ErrorReporter *

GetRegistrationFromOpCode(const OperatorCode *opcode, const OpResolver &op_resolver, ErrorReporter *error_reporter, const TfLiteRegistration **registration)
    TfLiteStatus

Classes

tflite::Allocation

A memory allocation handle. This could be a mmap or shared memory.
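
The idea behind the Allocation hierarchy is a handle that owns a block of model bytes and exposes a base pointer, a size, and a validity check. A minimal sketch of an mmap-backed variant, using a hypothetical `SimpleMmapAllocation` over anonymous memory (the real tflite::MMAPAllocation maps a model file instead):

```cpp
#include <sys/mman.h>
#include <cstddef>

// Hypothetical sketch of an mmap-backed allocation handle. The real
// tflite::MMAPAllocation maps a model file; this maps anonymous memory
// purely to illustrate the base()/bytes()/valid() shape of the interface.
class SimpleMmapAllocation {
 public:
  explicit SimpleMmapAllocation(size_t bytes)
      : size_(bytes),
        data_(mmap(nullptr, bytes, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS, /*fd=*/-1, /*offset=*/0)) {}
  ~SimpleMmapAllocation() {
    if (valid()) munmap(data_, size_);
  }
  const void* base() const { return data_; }   // start of the mapped bytes
  size_t bytes() const { return size_; }       // size of the mapping
  bool valid() const { return data_ != MAP_FAILED; }

 private:
  size_t size_;
  void* data_;
};
```

Note the validity check: as the class list below mentions, not every platform supports mmap-based allocation, so callers are expected to test `valid()` before use.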

tflite::ErrorReporter

A functor that reports errors to the supporting system.
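
The reporting pattern is a virtual `Report()` taking a printf-style format plus `va_list`, with a varargs convenience wrapper on top. A simplified sketch under hypothetical names (`SimpleErrorReporter`, `CapturingReporter`), not the real interface in tensorflow/lite/core/api/error_reporter.h:

```cpp
#include <cstdarg>
#include <cstdio>
#include <string>

// Hypothetical simplified error-reporter interface: subclasses override
// Report(), and ReportError() forwards printf-style varargs to it.
class SimpleErrorReporter {
 public:
  virtual ~SimpleErrorReporter() = default;
  virtual int Report(const char* format, va_list args) = 0;
  int ReportError(const char* format, ...) {
    va_list args;
    va_start(args, format);
    int result = Report(format, args);
    va_end(args);
    return result;
  }
};

// Analogous in spirit to tflite::StderrReporter, but captures the
// formatted message in a string so it can be inspected.
class CapturingReporter : public SimpleErrorReporter {
 public:
  int Report(const char* format, va_list args) override {
    char buf[256];
    int n = vsnprintf(buf, sizeof(buf), format, args);
    message_ = buf;
    return n;
  }
  const std::string& message() const { return message_; }

 private:
  std::string message_;
};
```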

tflite::FileCopyAllocation
tflite::MMAPAllocation

Note that not all platforms support MMAP-based allocation.

tflite::MemoryAllocation
tflite::MutableOpResolver

An OpResolver that is mutable; also used as the op resolver in gen_op_registration.

tflite::OpResolver

Abstract interface that returns TfLiteRegistrations given op codes or custom op names.
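
The resolver contract can be pictured as a registry keyed by builtin op code or custom op name, returning a registration (or null) on lookup. A self-contained sketch with hypothetical types (`MiniOpResolver`, `MiniRegistration`); the real tflite::MutableOpResolver additionally tracks op versions and real TfLiteRegistration structs:

```cpp
#include <string>
#include <unordered_map>

// Stand-in for TfLiteRegistration: just the fields a lookup needs.
struct MiniRegistration {
  const char* custom_name;
  int version;
};

// Hypothetical sketch of the OpResolver contract: register ops under a
// builtin code or a custom name, then resolve them; nullptr means the
// op is unknown to this resolver.
class MiniOpResolver {
 public:
  void AddBuiltin(int op_code, MiniRegistration reg) { builtins_[op_code] = reg; }
  void AddCustom(const std::string& name, MiniRegistration reg) { customs_[name] = reg; }

  const MiniRegistration* FindOp(int op_code) const {
    auto it = builtins_.find(op_code);
    return it == builtins_.end() ? nullptr : &it->second;
  }
  const MiniRegistration* FindOp(const std::string& name) const {
    auto it = customs_.find(name);
    return it == customs_.end() ? nullptr : &it->second;
  }

 private:
  std::unordered_map<int, MiniRegistration> builtins_;
  std::unordered_map<std::string, MiniRegistration> customs_;
};
```

The null-on-miss convention is what lets a caller such as GetRegistrationFromOpCode (below) turn a failed lookup into an error report rather than a crash.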

tflite::TfLiteIntArrayView

Provides a range-iterable wrapper for TfLiteIntArray* (the C list type used by the TfLite C API).
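
The wrapper's job is to give a C "length + array" struct `begin()`/`end()` so it works with range-based for. A self-contained sketch of the same idea using hypothetical types (`IntList`, `IntListView`); the real TfLiteIntArray uses a flexible array member allocated to the right length:

```cpp
#include <cstddef>

// Mimics the TfLiteIntArray layout: a size field followed by the data.
// Hypothetical fixed-capacity stand-in for illustration only.
struct IntList {
  int size;
  int data[8];
};

// Sketch of the TfLiteIntArrayView idea: wrap the C struct so that
// range-based for loops and size queries work over it.
class IntListView {
 public:
  explicit IntListView(const IntList* list) : list_(list) {}
  const int* begin() const { return list_->data; }
  const int* end() const { return list_->data + list_->size; }
  size_t size() const { return static_cast<size_t>(list_->size); }

 private:
  const IntList* list_;
};
```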

Structs

tflite::StderrReporter

Namespaces

tflite::impl
tflite::op_resolver_hasher

Typedefs

FlatBufferModel

::tflite::impl::FlatBufferModel FlatBufferModel

Interpreter

::tflite::impl::Interpreter Interpreter

An interpreter for a graph of nodes that input and output from tensors.

Each node of the graph processes a set of input tensors and produces a set of output tensors. All input/output tensors are referenced by index.

Usage:


// Create model from file. Note that the model instance must outlive the
// interpreter instance.
auto model = tflite::FlatBufferModel::BuildFromFile(...);
if (model == nullptr) {
  // Return error.
}
// Create an Interpreter with an InterpreterBuilder.
std::unique_ptr<tflite::Interpreter> interpreter;
tflite::ops::builtin::BuiltinOpResolver resolver;
if (InterpreterBuilder(*model, resolver)(&interpreter) != kTfLiteOk) {
  // Return failure.
}
if (interpreter->AllocateTensors() != kTfLiteOk) {
  // Return failure.
}

auto input = interpreter->typed_tensor<float>(0);
for (int i = 0; i < input_size; i++) {
  input[i] = ...;  // fill in input.
}
interpreter->Invoke();

Note: For nearly all practical use cases, one should not directly construct an Interpreter object, but rather use the InterpreterBuilder.

\warning This class is not thread-safe. The client is responsible for ensuring serialized interaction to avoid data races and undefined behavior.
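
One common way to meet the serialization requirement above is to funnel every call through a mutex-guarded wrapper. A generic sketch of that pattern, where the hypothetical `NotThreadSafe` type stands in for the interpreter (this is the client's responsibility, not something the library provides):

```cpp
#include <mutex>

// Stand-in for a non-thread-safe object such as the interpreter.
struct NotThreadSafe {
  int counter = 0;
  void Invoke() { ++counter; }  // pretend this is the real Invoke()
};

// Hypothetical wrapper that serializes all interaction with the target,
// so concurrent callers cannot race on its internal state.
class SerializedRunner {
 public:
  explicit SerializedRunner(NotThreadSafe* target) : target_(target) {}
  void Invoke() {
    std::lock_guard<std::mutex> lock(mu_);  // one caller at a time
    target_->Invoke();
  }

 private:
  NotThreadSafe* target_;
  std::mutex mu_;
};
```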

InterpreterBuilder

::tflite::impl::InterpreterBuilder InterpreterBuilder

Functions

DefaultErrorReporter

ErrorReporter * DefaultErrorReporter()

GetRegistrationFromOpCode

TfLiteStatus GetRegistrationFromOpCode(
  const OperatorCode *opcode,
  const OpResolver & op_resolver,
  ErrorReporter *error_reporter,
  const TfLiteRegistration **registration
)