This project showcases the object detection capabilities of TensorFlow Lite.
It creates the initial structure of a detection app with three main screens:

- a camera screen, the main implementation, which detects objects and gives real-time feedback to the user
- a history screen, which keeps a record of the items detected in past results
- a settings screen, where you can customize the parameters of the detection algorithm
The app could be extended to detect more products using custom models. The following trainings and materials highlight the steps to take:
- Training custom object detection models for TensorFlow Lite: https://developers.google.com/codelabs/tflite-object-detection-android#0
- Improving the user experience once an object or product is detected, by following the Material Design guidance: https://material.io/design/machine-learning/object-detection-live-camera.html#experience
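As a rough sketch of how the camera screen's detection step could look, the snippet below uses the TensorFlow Lite Task Library `ObjectDetector` API that the codelab above is built around. The asset name `model.tflite` and the function name `detectObjects` are placeholders, and the threshold values are illustrative defaults that the settings screen could expose:

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.support.image.TensorImage
import org.tensorflow.lite.task.vision.detector.Detection
import org.tensorflow.lite.task.vision.detector.ObjectDetector

// Sketch: run a bundled TFLite detection model on a single camera frame.
// "model.tflite" is a placeholder asset name; a model exported per the
// codelab above would be dropped into the app's assets folder.
fun detectObjects(context: Context, frame: Bitmap): List<Detection> {
    val options = ObjectDetector.ObjectDetectorOptions.builder()
        .setMaxResults(5)          // cap how many boxes are drawn on screen
        .setScoreThreshold(0.5f)   // candidate for the app's settings screen
        .build()
    val detector = ObjectDetector.createFromFileAndOptions(
        context, "model.tflite", options
    )
    return detector.detect(TensorImage.fromBitmap(frame))
}
```

Each returned `Detection` carries a bounding box and labeled scores, which the camera screen can overlay for real-time feedback and the history screen can persist as a record.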