[TF Lite] Working with TensorFlow Lite on Android with C++
一、Overview
Session record: https://conferences.oreilly.com/tensorflow/tf-ca-2019/public/schedule/detail/78543 by Joe Bowser
There are many cases where mobile developers write lower-level C++ code for their Android applications using the Android NDK, OpenCV, and other technologies. Joe Bowser explores:
- how to use TF Lite's C++ API on Android with existing code, so the code can interact directly with TF Lite without a round trip through the Java Native Interface (JNI) and the Android subsystem, allowing for cleaner, more portable code that can even be reused on iOS and other platforms
- common pitfalls when working with TF Lite as a C++ library, and when using TF Lite with OpenCV and/or Halide on Android
- techniques for integration testing that allow your tests to run in a CI/CD environment
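The CI/CD point can be sketched as follows. The `InferenceEngine` interface, `FakeEngine`, and `ArgMax` names are my own illustration, not from the talk: the idea is to hide TF Lite behind a small interface so integration tests of the pre/post-processing pipeline can run against a deterministic fake on a CI machine with no model file, device, or GPU.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical seam: production code wraps tflite::Interpreter behind
// this interface, so tests can substitute a deterministic fake.
class InferenceEngine {
 public:
    virtual ~InferenceEngine() = default;
    virtual std::vector<float> Run(const std::vector<float>& input) = 0;
};

// CI-friendly fake: returns a fixed "softmax-like" output so the
// surrounding pipeline can be exercised headlessly and repeatably.
class FakeEngine : public InferenceEngine {
 public:
    std::vector<float> Run(const std::vector<float>&) override {
        return {0.1f, 0.7f, 0.2f};
    }
};

// Post-processing under test: index of the highest score.
int ArgMax(const std::vector<float>& scores) {
    int best = 0;
    for (std::size_t i = 1; i < scores.size(); ++i) {
        if (scores[i] > scores[best]) best = static_cast<int>(i);
    }
    return best;
}
```

In production the real engine owns the interpreter; the tests never touch TF Lite at all.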
 
What you'll learn
Discover the pros and cons of various approaches to using TensorFlow Lite in a production environment, and whether Java or C++ is the better choice for your project.
二、Low-Level Optimization
Ref: [ARM] Arm Compute Library for computer vision and machine learning
See how far the low-level optimizations in OpenCV and TFLite have come.
三、Hybrid Integration
The point of hybrid integration is to make maximum use of "hardware optimization."
Hardware: Coral Dev Board
Goto: https://aiyprojects.withgoogle.com/edge-tpu/
ACL (Arm Compute Library) C++ on Linux
/* implement */
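A minimal sketch of what this slot could contain, using the Compute Library's init/configure/allocate/run pattern — a single NEON-accelerated ReLU activation. The tensor shape is a placeholder and the build must link against libarm_compute:

```cpp
// Sketch: one NEON-accelerated activation with the Arm Compute Library.
// Pattern: init tensor metadata -> configure the function -> allocate -> run.
#include "arm_compute/runtime/NEON/NEFunctions.h"
#include "arm_compute/runtime/Tensor.h"

int main() {
    using namespace arm_compute;

    Tensor src, dst;
    // 224x224x3 float tensors; this sets metadata only, no memory yet.
    src.allocator()->init(TensorInfo(TensorShape(224U, 224U, 3U), 1, DataType::F32));
    dst.allocator()->init(TensorInfo(TensorShape(224U, 224U, 3U), 1, DataType::F32));

    // Configure a ReLU layer before allocating backing memory.
    NEActivationLayer relu;
    relu.configure(&src, &dst,
                   ActivationLayerInfo(ActivationLayerInfo::ActivationFunction::RELU));

    src.allocator()->allocate();
    dst.allocator()->allocate();

    // ... fill src with image data here ...

    relu.run();  // executes on the CPU using NEON kernels
    return 0;
}
```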
TFLite C++ on Linux
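The corresponding TF Lite C++ flow, as a minimal sketch — the model path and tensor layout are placeholders, and the binary must link against libtensorflow-lite:

```cpp
// Sketch: load a .tflite flatbuffer, build an interpreter, run inference.
#include <cstdio>
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
    // 1. Load the flatbuffer model from disk (path is a placeholder).
    auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
    if (!model) return 1;

    // 2. Build an interpreter with the built-in op resolver.
    tflite::ops::builtin::BuiltinOpResolver resolver;
    std::unique_ptr<tflite::Interpreter> interpreter;
    tflite::InterpreterBuilder(*model, resolver)(&interpreter);
    if (!interpreter || interpreter->AllocateTensors() != kTfLiteOk) return 1;

    // 3. Fill the input tensor (a float input is assumed here).
    float* input = interpreter->typed_input_tensor<float>(0);
    input[0] = 1.0f;  // ... populate the rest of the input ...

    // 4. Run inference and read the output.
    if (interpreter->Invoke() != kTfLiteOk) return 1;
    const float* output = interpreter->typed_output_tensor<float>(0);
    std::printf("output[0] = %f\n", output[0]);
    return 0;
}
```

The same flow works unchanged on Android via the NDK, which is the JNI-free path the talk describes.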
TFLite's GPU delegate, built on OpenGL ES, provides better performance; for more details see: [AR] TensorFlow Lite with GPU
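Enabling the GPU backend from C++ amounts to attaching the delegate to an existing interpreter. A sketch, assuming an interpreter built as above and linking the GPU delegate library; `EnableGpu` is my own helper name:

```cpp
// Sketch: attach the TF Lite GPU delegate (OpenGL ES / OpenCL backend).
#include "tensorflow/lite/delegates/gpu/delegate.h"
#include "tensorflow/lite/interpreter.h"

// Returns true when the graph was re-planned onto the GPU; on failure
// the caller simply keeps using the CPU path. The delegate must stay
// alive for the lifetime of the interpreter.
bool EnableGpu(tflite::Interpreter* interpreter) {
    TfLiteGpuDelegateOptionsV2 options = TfLiteGpuDelegateOptionsV2Default();
    // Prefer latency; FP16 precision is usually fine for vision models.
    options.inference_priority1 = TFLITE_GPU_INFERENCE_PRIORITY_MIN_LATENCY;
    TfLiteDelegate* delegate = TfLiteGpuDelegateV2Create(&options);
    if (interpreter->ModifyGraphWithDelegate(delegate) != kTfLiteOk) {
        TfLiteGpuDelegateV2Delete(delegate);
        return false;
    }
    return true;
}
```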
Ref: Real Computer Vision for mobile and embedded. Part 2. Select the right tool.
- Performance: This framework was initially created for ML inference on embedded and low-end hardware, so its main compute resource is the CPU. That means large models may run slowly and consume a lot of battery, and sustained inference can overheat the phone.
 Despite the disadvantages above, TFLite is almost the only tool that works across the full variety of Android ARM devices. It applies every available optimization to run your model efficiently on-device, and that is enough for many Android ML apps.
 P.S. The experimental branch of the TF Lite library includes GPU acceleration support via OpenGL; it shows good results on the latest phone models.
- ML operations (layers) coverage: The situation is similar to the iOS description. A good approach is to train server-side with the TensorFlow framework and use the official converter.
- Hardware specifications: Even though there are thousands of phone models, there is only a limited number of CPU architectures; 99 percent of the market is ARM-based. TF Lite uses CPU-efficient SIMD instructions (such as NEON) for ML inference.
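The NEON point can be made concrete with a tiny portable kernel: the same dot product compiles to 128-bit SIMD on ARM builds and to a scalar loop everywhere else. A sketch only — note that `vaddvq_f32` is AArch64-specific:

```cpp
#include <cassert>
#include <cstddef>

// Dot product with a NEON fast path and a portable scalar fallback.
// On ARM builds (__ARM_NEON defined) four floats are processed per
// iteration in 128-bit lanes; elsewhere the scalar loop runs.
#if defined(__ARM_NEON)
#include <arm_neon.h>
float DotProduct(const float* a, const float* b, std::size_t n) {
    float32x4_t acc = vdupq_n_f32(0.0f);
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        // acc += a[i..i+3] * b[i..i+3], four lanes at once
        acc = vmlaq_f32(acc, vld1q_f32(a + i), vld1q_f32(b + i));
    }
    float sum = vaddvq_f32(acc);  // horizontal add (AArch64 only)
    for (; i < n; ++i) sum += a[i] * b[i];  // remainder
    return sum;
}
#else
float DotProduct(const float* a, const float* b, std::size_t n) {
    float sum = 0.0f;
    for (std::size_t i = 0; i < n; ++i) sum += a[i] * b[i];
    return sum;
}
#endif
```

This is the kind of kernel TFLite's optimized CPU backends hand-tune per architecture so that application code never has to.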