YOLOv5 TFLite — Inference on Mobile Devices

Draden Liang Han Sheng
4 min read · Jan 13, 2022


Learning Outcomes:

  1. Train a custom YOLOv5 model
  2. Convert the PyTorch model file (.pt) to TFLite (.tflite)
  3. Add metadata to the TFLite model


Python>=3.6.0 and PyTorch>=1.7 are required:

git clone https://github.com/Techyhans/yolov5-portholes.git
cd yolov5

You can use venv or conda based on your preference, then install the required dependencies:

pip install -r requirements.txt

Train custom Yolov5 model

As an example, let’s use this public dataset from Roboflow, the North American Mushrooms dataset (YOLOv5 PyTorch version). Unzip it and place the folder under the data/ folder of the yolov5 folder. For simplicity, let’s rename North American Mushrooms.v1-416x416.yolov5pytorch/ to sample/.

If you are annotating the images yourself, check out ultralytics Train-Custom-Data to prepare your data and labels. Remember to change the config in the .yaml file according to your dataset.

If you are using the mushroom dataset, replace the content of the data.yaml file:

path: data/sample
train: train/images
val: valid/images
test: test/images
nc: 2
names: ['CoW', 'chanterelle']
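
Before launching a long training run, it can help to sanity-check that the folders referenced by the config actually exist. Below is a small hypothetical helper (not part of the YOLOv5 repo) that counts image and label files per split; the data/sample path and the images/labels layout follow the YAML above.

```python
# Hypothetical sanity check for a YOLOv5-style dataset layout
# (data/sample path assumed, per the data.yaml above).
from pathlib import Path

def check_dataset(root, splits=("train", "valid", "test")):
    """Return {split: (num_images, num_labels)} for a YOLOv5-style dataset."""
    counts = {}
    for split in splits:
        images = [p for p in Path(root, split, "images").glob("*") if p.is_file()]
        labels = list(Path(root, split, "labels").glob("*.txt"))
        counts[split] = (len(images), len(labels))
    return counts

if __name__ == "__main__":
    for split, (n_img, n_lbl) in check_dataset("data/sample").items():
        print(f"{split}: {n_img} images, {n_lbl} labels")
```

If a split reports zero images, or image and label counts disagree, fix the folder layout before training.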

Start your training by specifying the following arguments:

  • --img: image size
  • --batch: batch size
  • --epochs: number of epochs
  • --data: the dataset .yaml file that lists the config
  • --weights yolov5s.pt: pretrained weights (recommended)
  • or --weights '' --cfg yolov5s.yaml: randomly initialized weights


python train.py --img 640 --batch 16 --epochs 300 --data sample/data.yaml --weights yolov5s.pt

Your training results will be saved to runs/train/exp<no.>. You can find your best and last weights under weights/ inside that folder.

Let’s move the PyTorch model file you want to convert (e.g. best.pt) to the root folder (yolov5/).

Convert PyTorch (.pt) model file to TFLite (.tflite)

As of 30 Oct 2021, the official ultralytics yolov5 GitHub repo does not support this conversion for an object detection model that can carry metadata and be used on Android later.

The reason is that YOLOv5 exported models generally concatenate outputs into a single output tensor, and TFLite models do not export with NMS; only TF.js and pipelined CoreML models contain NMS.

You can see more details of the limitations and experiments in this discussion. However, there are still some options for exporting a TFLite model that works on Android.

In my GitHub repo, I have incorporated the second option, TFLite with NMS, so you can export a TFLite model using the command below:

python export.py --weights best.pt --include tflite --tf-nms --agnostic-nms

You will get output tensors with the following shapes (source):

Four Outputs

  • detection_boxes: a float32 tensor of shape [1, num_boxes, 4] with box locations
  • detection_classes: a float32 tensor of shape [1, num_boxes] with class indices
  • detection_scores: a float32 tensor of shape [1, num_boxes] with class scores
  • num_boxes: a float32 tensor of size 1 containing the number of detected boxes
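
The four tensors above can be turned into readable detections with a few lines of NumPy. This is a minimal post-processing sketch, assuming batch size 1 and boxes in [ymin, xmin, ymax, xmax] order as produced by the TFLite detection postprocess op; the function and variable names are illustrative, not part of the exported model.

```python
# Minimal post-processing sketch for the four output tensors above.
# Assumptions: batch size 1, boxes as [ymin, xmin, ymax, xmax].
import numpy as np

def decode_detections(boxes, classes, scores, num_boxes, labels, score_thresh=0.25):
    """Keep the first num_boxes detections whose score passes the threshold."""
    results = []
    for i in range(int(num_boxes[0])):
        if scores[0, i] < score_thresh:
            continue  # drop low-confidence boxes
        results.append({
            "label": labels[int(classes[0, i])],
            "score": float(scores[0, i]),
            "box": tuple(float(v) for v in boxes[0, i]),
        })
    return results

# Toy example with two detections, one below the threshold:
boxes = np.array([[[0.1, 0.1, 0.5, 0.5], [0.2, 0.2, 0.6, 0.6]]], dtype=np.float32)
classes = np.array([[0.0, 1.0]], dtype=np.float32)
scores = np.array([[0.9, 0.1]], dtype=np.float32)
num_boxes = np.array([2.0], dtype=np.float32)
print(decode_detections(boxes, classes, scores, num_boxes, ["CoW", "chanterelle"]))
```

On an Android device the same filtering would typically be done in Kotlin/Java after invoking the interpreter, but the logic is identical.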

Metadata Writer

I provide two versions of the metadata writer in my GitHub repo:

  • V1 attaches the default model name and description
  • V2 allows you to specify your own model name and description

  1. The first version follows the TensorFlow metadata writer tutorial; it attaches the default name and description below.
"name": "ObjectDetector",
"description": "Identify which of a known set of objects might be present and provide information about their positions within the given image or a video stream."

Start generating the metadata by specifying the following arguments:

  • --model_file: TFLite model
  • --label_file: A text file that lists the labels


python metadata_writer_v1.py --model_file best-fp16.tflite --label_file labels.txt
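
The --label_file argument expects a plain text file with one class name per line, in the same order as the names list in data.yaml. A minimal sketch to generate it for the mushroom dataset (class names taken from the config above):

```python
# labels.txt: one class name per line, matching the order of `names`
# in data.yaml (mushroom dataset assumed).
names = ["CoW", "chanterelle"]
with open("labels.txt", "w") as f:
    f.write("\n".join(names) + "\n")
```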

2. If you want to change the model name and description, you can use the second version, metadata_writer_v2.py.

Update the following in the python file according to your model details.

# Your model details here
model_path = 'best-fp16.tflite'
label_path = 'labels.txt'
model_meta.name = "Model name"
model_meta.description = (
    "description line ..."
    "description line ..."
)
Then, generate the TFLite with metadata:

python metadata_writer_v2.py


You now have a TFLite model ready to put in your Android app! Check out the Android quickstart tutorial by TensorFlow to deploy it.


You can find all the code above at this site.


  1. https://github.com/ultralytics/yolov5
  2. https://github.com/zldrobit/yolov5/tree/tf-android-flex-ops-class-agnostic-nms
  3. https://github.com/ultralytics/yolov5/discussions/2095
  4. https://tensorflow.google.cn/lite/convert/metadata
  5. https://tensorflow.google.cn/lite/convert/metadata_writer_tutorial#object_detectors

About Author

This article is written by Han Sheng, Technical Lead at Arkmind, Malaysia. He has a passion for software design and architecture, computer vision, and edge devices. He has made several AI-based web and mobile applications to help clients solve real-world problems. Feel free to read about him via his GitHub profile.
