TFLite Support is a toolkit that helps users develop ML applications and deploy TFLite models onto mobile devices. It works cross-platform and is supported in Java, C++ (WIP), and Swift (WIP). The TFLite Support project consists of the following major components:
The TFLite Support library serves different tiers of deployment requirements, from easy onboarding to fully customizable pipelines. TFLite Support targets three major use cases:
Provide ready-to-use APIs for users to interact with the model.
This is achieved by the TFLite Support Codegen tool: users can get a model interface (containing ready-to-use APIs) simply by passing the model to the codegen tool. The automatic codegen strategy is based on the TFLite metadata.
Provide optimized model interfaces for popular ML tasks.
The model interfaces provided by the TFLite Support Task Library are specifically optimized compared to the codegen version in terms of both usability and performance. Users can also swap in their own custom models for the default models in each task.
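As an illustration of how little code a Task Library interface requires, here is a hedged sketch of text classification with NLClassifier in Java. It assumes the Task Library text artifact is on the classpath, and "model.tflite" is a placeholder path standing in for a real text classification model with TFLite metadata attached:

```java
import java.io.File;
import java.util.List;

import org.tensorflow.lite.support.label.Category;
import org.tensorflow.lite.task.text.nlclassifier.NLClassifier;

public class ClassifyExample {
    public static void main(String[] args) throws Exception {
        // "model.tflite" is a placeholder; use your own metadata-populated model.
        NLClassifier classifier = NLClassifier.createFromFile(new File("model.tflite"));

        // classify() returns one Category (label + score) per output class.
        List<Category> results = classifier.classify("What a wonderful movie!");
        for (Category c : results) {
            System.out.println(c.getLabel() + ": " + c.getScore());
        }
        classifier.close();
    }
}
```

The same pattern applies across tasks: create the classifier from a model file, call the task-specific inference method, and read back structured results instead of raw tensors.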
Provide the flexibility to customize model interfaces and build inference pipelines.
The TFLite Support Util Library contains a variety of utility methods and data structures to perform pre/post processing and data conversion. It is also designed to match the behavior of TensorFlow modules, such as TF.Image and TF.Text, ensuring consistency from training to inference.
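A minimal sketch of what such a pre-processing pipeline looks like with the Util Library's image helpers, assuming an Android `Bitmap` as input; the 224x224 target size and the [-1, 1] normalization are assumptions for a typical vision model, not values mandated by the library:

```java
import android.graphics.Bitmap;

import org.tensorflow.lite.support.common.ops.NormalizeOp;
import org.tensorflow.lite.support.image.ImageProcessor;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.support.image.ops.ResizeOp;

public class PreprocessExample {
    // Resize to the model's expected input size, then normalize pixel
    // values from [0, 255] to [-1, 1] (both settings are model-specific).
    static TensorImage preprocess(Bitmap bitmap) {
        ImageProcessor processor = new ImageProcessor.Builder()
            .add(new ResizeOp(224, 224, ResizeOp.ResizeMethod.BILINEAR))
            .add(new NormalizeOp(127.5f, 127.5f))
            .build();
        TensorImage image = TensorImage.fromBitmap(bitmap);
        return processor.process(image);
    }
}
```

The resulting TensorImage can be fed directly to a TFLite Interpreter, which keeps the on-device pipeline consistent with the preprocessing used at training time.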
See the documentation on tensorflow.org for more instructions and examples.
We use Bazel to build the project. When building the Java (Android) utils, you need to set the following environment variables correctly:
ANDROID_NDK_HOME
ANDROID_SDK_HOME
ANDROID_NDK_API_LEVEL
ANDROID_SDK_API_LEVEL
ANDROID_BUILD_TOOLS_VERSION
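The setup above can be sketched as a few exports; every path and version number below is a placeholder to adjust for your own SDK/NDK installation:

```shell
# All paths and version numbers are placeholders; point them at your
# local Android SDK/NDK installation.
export ANDROID_NDK_HOME="$HOME/Android/Sdk/ndk/21.4.7075529"
export ANDROID_SDK_HOME="$HOME/Android/Sdk"
export ANDROID_NDK_API_LEVEL="21"
export ANDROID_SDK_API_LEVEL="30"
export ANDROID_BUILD_TOOLS_VERSION="30.0.3"

# With the variables in place, Bazel can build the Java targets, e.g.:
# bazel build //tensorflow_lite_support/java/...
echo "Building against NDK API level $ANDROID_NDK_API_LEVEL"
```

Exporting the variables in your shell profile (or a wrapper script) keeps repeated builds consistent across terminals.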
Let us know what you think about TFLite Support by creating a new GitHub issue, or email us at tflite-support-team@google.com.