This repository is the official PyTorch implementation of the following paper. Our code can reproduce the training and testing results reported in the paper.
RAPiD: Rotation-Aware People Detection in Overhead Fisheye Images
[arXiv paper] [Project page]
Requirements: The code should work as long as you have the following packages installed:
- PyTorch >= 1.0. Installation instructions can be found at https://pytorch.org/get-started/locally/
- opencv-python
- pycocotools (for Windows users, please refer to this repo)
- tqdm
- tensorboard (optional, only for training)
An example of installation on Linux with CUDA 10.1 and Conda:
conda create --name RAPiD_env python=3.7
conda activate RAPiD_env
conda install pytorch torchvision cudatoolkit=10.1 -c pytorch
conda install -c conda-forge pycocotools
conda install tqdm opencv
# cd the_folder_to_install
git clone https://github.com/duanzhiihao/RAPiD.git
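After installation, the snippet below is one quick way to sanity-check the environment. It is a minimal sketch that simply imports the required packages and mirrors the version requirements listed above.

```python
# Minimal environment sanity check (illustrative; adjust to your setup).
import torch
import torchvision
import cv2
import pycocotools
import tqdm

print('PyTorch:', torch.__version__)                  # should be >= 1.0
print('torchvision:', torchvision.__version__)
print('OpenCV:', cv2.__version__)
print('CUDA available:', torch.cuda.is_available())   # True if cudatoolkit matches your driver
```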
Below is the cross-validation performance on three datasets: Mirror Worlds (rotated-bbox version, denoted MW-R), HABBOF, and CEPDOF. The metric is Average Precision at IoU = 0.5 (AP0.5). The links in the table point to the pre-trained network weights that reproduce each number.
Resolution | MW-R | HABBOF | CEPDOF |
---|---|---|---|
608 | 96.6 | 97.3 | 82.4 |
1024 | 96.7 | 98.1 | 85.8 |
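For context, AP0.5 is computed by ranking all detections by confidence, counting a detection as a true positive if it overlaps a not-yet-matched ground-truth box with IoU >= 0.5 (for RAPiD, IoU between rotated boxes), and integrating precision over recall. The sketch below shows that generic computation; it is illustrative only and is not the repository's evaluation code.

```python
import numpy as np

def average_precision(scores, is_tp, num_gt):
    """Generic all-point AP from ranked detections.

    scores: confidence of each detection
    is_tp:  1 if the detection matched an unmatched ground truth at IoU >= 0.5
    num_gt: total number of ground-truth boxes
    """
    order = np.argsort(-np.asarray(scores, dtype=np.float64))
    tp = np.asarray(is_tp, dtype=np.float64)[order]
    cum_tp = np.cumsum(tp)
    cum_fp = np.cumsum(1.0 - tp)
    recall = cum_tp / max(num_gt, 1)
    precision = cum_tp / (cum_tp + cum_fp)
    # VOC-style envelope: make precision non-increasing, then integrate over recall.
    mrec = np.concatenate(([0.0], recall, [1.0]))
    mpre = np.concatenate(([0.0], precision, [0.0]))
    mpre = np.maximum.accumulate(mpre[::-1])[::-1]
    changed = np.where(mrec[1:] != mrec[:-1])[0]
    return float(np.sum((mrec[changed + 1] - mrec[changed]) * mpre[changed + 1]))

# Example: 3 detections, 2 ground truths -> AP = 0.5*1.0 + 0.5*(2/3) ~= 0.83
print(average_precision([0.9, 0.8, 0.7], [1, 0, 1], num_gt=2))
```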
- Clone the repository
- Download the pre-trained network weights, which are trained on COCO, MW-R, and HABBOF, and place them under the RAPiD/weights folder.
- Run `python example.py`; an illustrative sketch of such an inference script is given below. Alternatively, `demo.ipynb` walks through the same example in a Jupyter notebook.
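For orientation, here is a sketch of what a minimal inference script might look like. The `Detector` class, its arguments, the weight filename, and the image path are assumptions made for illustration; check them against example.py in the repository.

```python
# Hypothetical minimal inference script; class and argument names are
# illustrative assumptions, not the repository's confirmed API.
from api import Detector  # assumed to be provided by the RAPiD repo

# Load RAPiD with the pre-trained weights placed under RAPiD/weights/.
detector = Detector(
    model_name='rapid',
    weights_path='./weights/rapid_pretrained.ckpt',  # hypothetical filename
)

# Detect people in one overhead fisheye image; RAPiD predicts rotated
# bounding boxes (center x, center y, width, height, angle) with confidences.
detector.detect_one(
    img_path='./images/example.jpg',  # hypothetical path
    input_size=1024,                  # matches the 1024 row in the table above
    conf_thres=0.3,
    visualize=True,
)
```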
TBD
- Update README
The RAPiD source code is available for non-commercial use. If you find our code and dataset useful, or publish any work reporting results obtained with this source code, please consider citing our paper:
Z. Duan, M.O. Tezcan, H. Nakamura, P. Ishwar and J. Konrad,
“RAPiD: Rotation-Aware People Detection in Overhead Fisheye Images”,
in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR),
Omnidirectional Computer Vision in Research and Industry (OmniCV) Workshop, June 2020.