TensorFlow Lite is the “evolution” of Google’s previous efforts to get TensorFlow AI onto mobile devices. Google’s existing TensorFlow Mobile API will remain operational but is no longer the recommended solution for mobile AI. Google advised developers to start moving towards TensorFlow Lite for new apps but stressed that Mobile will still be supported for existing products.
The new platform is designed from the ground up to facilitate the “low-latency inference” of machine learning models on mobile devices. These are constrained by low-performance processors, limited memory and relatively small storage reserves. TensorFlow Lite addresses these issues by specifically optimising models for the confines of smartphone hardware.
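One of the best-known techniques for fitting a model into constrained hardware is 8-bit quantisation: mapping 32-bit floating-point weights onto 256 integer levels, cutting memory use roughly fourfold at a small cost in precision. The sketch below is purely illustrative (it is not TensorFlow Lite's actual implementation) and shows the basic idea:

```python
# Illustrative sketch of 8-bit weight quantisation -- one way a framework
# can shrink a model for memory-constrained devices. This is NOT
# TensorFlow Lite source code, just the underlying arithmetic.

def quantize(weights):
    """Map a list of floats onto uint8 levels; return values plus the
    (scale, zero_point) needed to recover approximate floats later."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255 or 1.0          # avoid zero scale for constant weights
    zero_point = round(-lo / scale)          # integer that represents 0.0
    q = [round(w / scale) + zero_point for w in weights]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float weights from their 8-bit representation."""
    return [(v - zero_point) * scale for v in q]

weights = [-1.2, 0.0, 0.5, 2.3]
q, scale, zp = quantize(weights)
restored = dequantize(q, scale, zp)
# Each restored weight is within one quantisation step (scale) of the original.
```

Storing each weight in one byte instead of four is what makes the difference on a phone with limited RAM; the reconstruction error stays below one quantisation step.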
“TensorFlow has always run on many platforms, from racks of servers to tiny IoT devices, but as the adoption of machine learning models has grown exponentially over the last few years, so has the need to deploy them on mobile and embedded devices,” said Google. “TensorFlow Lite enables low-latency inference of on-device machine learning models.”
On most devices, TensorFlow Lite will run AI models in a highly optimised form on the CPU. It’s also ready to use with the dedicated machine learning coprocessors that are starting to appear on some newer smartphones. Custom hardware of this kind is expected to become more common over the next year, so Google’s designing TensorFlow Lite to help developers access the extra resources it provides.
TensorFlow Lite initially supports a handful of pre-trained models optimised for mobile devices. These include MobileNet, which enables computer vision across more than 1,000 object classes, and the Inception v3 image recognition model. Google’s also offering access to its one-touch reply generator, Smart Reply. These models are available for immediate use inside apps, but developers will also be able to create their own models from custom datasets as required.
The toolkit is available in preview form from today. Google said it’s intentionally limited in functionality for now, as the focus has been on establishing a performance baseline across some of the most common AI models. The scope and capabilities of TensorFlow Lite will grow as development continues.