RetinaFace in PyTorch

A PyTorch implementation of RetinaFace: Single-stage Dense Face Localisation in the Wild. The model size is only 1.7M when RetinaFace uses mobilenet0.25 as the backbone network; we also provide resnet50 as a backbone for better accuracy. The official MXNet code can be found here.
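Selecting a backbone comes down to picking the matching config when constructing the network. The following is only a minimal sketch, assuming the upstream Pytorch_Retinaface layout (configs in data/config.py, model class in models/retinaface.py); names may differ slightly in this fork:

  from data import cfg_mnet, cfg_re50            # backbone-specific configs (assumed layout)
  from models.retinaface import RetinaFace

  cfg = dict(cfg_mnet, pretrain=False)           # use cfg_re50 for the resnet50 backbone; pretrain=False avoids needing the ImageNet tar here
  net = RetinaFace(cfg=cfg, phase='test')        # 'test' phase for inference
  net.eval()

  # Quick sanity check of the parameter count for the chosen backbone.
  print(sum(p.numel() for p in net.parameters()))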

Data

  1. Organise the dataset directory as follows (a quick layout check is sketched after this list):
  ./data/widerface/
    train/
      images/
      label.txt
    val/
      images/
      wider_val.txt
  2. We also provide the organised dataset we used, in the directory structure above, for download from google cloud or baidu cloud (password: ruck).
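A short script can confirm the layout before training. This is only a sketch and is not part of the repository; it simply checks that the files above exist:

  import os

  # Files and folders the training and evaluation scripts expect (layout shown above).
  expected = [
      './data/widerface/train/images',
      './data/widerface/train/label.txt',
      './data/widerface/val/images',
      './data/widerface/val/wider_val.txt',
  ]

  for path in expected:
      status = 'OK     ' if os.path.exists(path) else 'MISSING'
      print(f'{status} {path}')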

Installation

git clone https://github.com/HungLM1506/Detection-Tiny-Face-with-RetinaFace_pytorch.git
cd Detection-Tiny-Face-with-RetinaFace_pytorch
pip install -r requirement.txt

Training

We provide resnet50 and mobilenet0.25 as backbone networks to train the model. We pretrained mobilenet0.25 on the ImageNet dataset and obtained 46.58% top-1 accuracy. If you do not wish to train the model, we also provide trained models. The pretrained and trained models are available Here. The model files can be organised as follows:

  ./weights/
      mobilenet0.25_Final.pth
      mobilenetV1X0.25_pretrain.tar
      Resnet50_Final.pth
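Once the weights are in place, a checkpoint can be loaded for inference. This is only a minimal sketch and assumes the upstream Pytorch_Retinaface module layout (data/config.py and models/retinaface.py); checkpoints saved with nn.DataParallel may carry a 'module.' prefix on their keys:

  import torch
  from data import cfg_re50                     # or cfg_mnet for mobilenet0.25
  from models.retinaface import RetinaFace

  cfg = dict(cfg_re50, pretrain=False)          # skip ImageNet pretraining; the checkpoint overrides it anyway
  net = RetinaFace(cfg=cfg, phase='test')
  state_dict = torch.load('./weights/Resnet50_Final.pth', map_location='cpu')
  # Strip the 'module.' prefix that nn.DataParallel adds when saving.
  state_dict = {k[7:] if k.startswith('module.') else k: v for k, v in state_dict.items()}
  net.load_state_dict(state_dict)
  net.eval()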

Run

python detect.py --trained_model weights/Resnet50_Final.pth --network resnet50
or, for the mobilenet0.25 backbone:
python detect.py --trained_model weights/mobilenet0.25_Final.pth --network mobile0.25

Evaluation on WIDER FACE val

  1. Generate the txt result files (use --network resnet50 for the resnet50 model):
python test_widerface.py --trained_model weight_file --network mobile0.25
  2. Evaluate the txt results. The evaluation demo comes from Here:
cd ./widerface_evaluate
python setup.py build_ext --inplace
python evaluation.py
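The txt files produced in step 1 are assumed to follow the standard WIDER FACE submission format: the image name on the first line, the number of detections on the second, then one "left top width height score" line per face. A purely illustrative sketch of writing that format (the helper and the values below are not part of the repository):

  # Illustrative only: write detections for one image in the WIDER FACE txt format.
  def save_widerface_txt(path, image_name, detections):
      # detections: list of (left, top, width, height, score) tuples in pixels.
      with open(path, 'w') as f:
          f.write(image_name + '\n')
          f.write(str(len(detections)) + '\n')
          for left, top, width, height, score in detections:
              f.write(f'{left} {top} {width} {height} {score:.3f}\n')

  # Hypothetical example values:
  save_widerface_txt('0_Parade_marchingband_1_20.txt', '0_Parade_marchingband_1_20',
                     [(10, 20, 40, 55, 0.998)])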

References

@inproceedings{deng2019retinaface,
  title={RetinaFace: Single-stage Dense Face Localisation in the Wild},
  author={Deng, Jiankang and Guo, Jia and Zhou, Yuxiang and Yu, Jinke and Kotsia, Irene and Zafeiriou, Stefanos},
  booktitle={arXiv},
  year={2019}
}
