Now, the search giant has launched the developer preview of a new machine learning toolkit designed specifically for smartphones and embedded devices, available to both Android and iOS app developers. The platform lets developers deploy AI on mobile devices: it enables on-device machine learning inference with low latency and a small binary size. TensorFlow Lite also supports hardware acceleration through the Android Neural Networks API.

“As you may know, TensorFlow already supports mobile and embedded deployment of models through the TensorFlow Mobile API,” the TensorFlow team wrote in a blog post today. “Going forward, TensorFlow Lite should be seen as the evolution of TensorFlow Mobile, and as it matures it will become the recommended solution for deploying models on mobile and embedded devices. With this announcement, TensorFlow Lite is made available as a developer preview, and TensorFlow Mobile is still there to support production apps.”

Android engineering VP Dave Burke notes that it is a “crucial step toward enabling hardware-accelerated neural network processing across Android’s diverse silicon ecosystem.”

TensorFlow Mobile allows developers to take TensorFlow models that work in a desktop environment and incorporate them into mobile apps. Applications built with TensorFlow Lite, however, will be lighter and faster than similar applications that use TensorFlow Mobile, though not all use cases are currently supported by TensorFlow Lite.

“TensorFlow has always run on many platforms, from racks of servers to tiny IoT devices, but as the adoption of machine learning models has grown exponentially over the last few years, so has the need to deploy them on mobile and embedded devices,” the TensorFlow team wrote.

Three models come already trained and optimized for mobile devices:

- MobileNet: A class of vision models able to identify across 1,000 different object classes, specifically designed for efficient execution on mobile and embedded devices.
- Inception v3: An image recognition model, similar in functionality to MobileNet, that offers higher accuracy but also has a larger size.
- Smart Reply: An on-device conversational model that provides one-touch replies to incoming conversational chat messages. First-party and third-party messaging apps use this feature on Android Wear.
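To give a sense of the workflow, here is a minimal sketch of converting a trained TensorFlow model into the compact format TensorFlow Lite runs on-device. The original developer preview shipped a command-line converter; the tf.lite.TFLiteConverter Python API shown here comes from later TensorFlow releases, and the saved_mobilenet/ path is a hypothetical example:

import tensorflow as tf

# Load a trained model from a SavedModel directory (hypothetical path) and
# convert it to the FlatBuffer format that TensorFlow Lite executes on-device.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_mobilenet/")
tflite_model = converter.convert()

# Write out the converted model; this .tflite file is what ships inside the app.
with open("mobilenet.tflite", "wb") as f:
    f.write(tflite_model)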
TensorFlow Lite was redesigned from scratch to concentrate on three areas:

- Lightweight: Enables inference of on-device machine learning models with a small binary size and fast initialization/start-up.
- Cross-platform: A runtime designed to run on many different platforms, starting with Android and iOS.
- Fast: Optimized for mobile devices, including dramatically improved model loading times and support for hardware acceleration.
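Once converted, the .tflite file is driven by the TensorFlow Lite interpreter, whose small runtime and fast start-up reflect the “Lightweight” and “Fast” goals above. The Python binding sketched below (tf.lite.Interpreter, available in standard TensorFlow releases) mirrors the Java and C++ APIs used on-device; the file name and input shape are illustrative:

import numpy as np
import tensorflow as tf

# Load the converted model and allocate its tensors; the interpreter is
# designed to initialize quickly with a small memory footprint.
interpreter = tf.lite.Interpreter(model_path="mobilenet.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy image shaped to the model's expected input (e.g. 1x224x224x3
# for a MobileNet variant) and run a single inference pass.
dummy_input = np.random.rand(*input_details[0]["shape"]).astype(
    input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()

# Read back the scores; for an image classifier this is a probability
# vector over the object classes.
scores = interpreter.get_tensor(output_details[0]["index"])
print("Top class index:", int(np.argmax(scores)))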
“With this developer preview, we have intentionally started with a constrained platform to ensure performance on some of the most important common models,” a post authored by the TensorFlow team read. “We plan to prioritize future functional expansion based on the needs of our users. The goals for our continued development are to simplify the developer experience, and enable model deployment for a range of mobile and embedded devices.”

The TensorFlow Lite developer toolkit is now available in preview form on GitHub, along with code samples and demo applications. Those interested in learning more about TensorFlow Lite can check the documentation here.

Source: 9TO5Google