Fully Automated Person ReID (FAPR)

Most existing datasets are only used for one of the three considered tasks. Therefore, in this study we built our own dataset, called the Fully Automated Person ReID (FAPR) dataset, to evaluate the performance of each step of a fully automated person ReID system. The dataset contains a total of 15 videos recorded over three days by two static non-overlapping cameras at HD resolution (1920 × 1080) and 20 frames per second (fps), in both indoor and outdoor environments. Table 1 summarizes the dataset in terms of #Images, #Bounding boxes, #BB/Image, #IDs, and #Tracklets. The main characteristics of the dataset are described below.
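Since the raw data are distributed as videos, a frame-extraction step is typically needed before annotation or evaluation. The sketch below shows one way to dump frames with OpenCV; the video file name and output layout are illustrative assumptions, not part of the dataset specification.

```python
import os
import cv2  # OpenCV (opencv-python), assumed available

def extract_frames(video_path, out_dir):
    """Dump every frame of an FAPR video (1920x1080, 20 fps) as a JPEG."""
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of video
            break
        cv2.imwrite(os.path.join(out_dir, f"{idx:06d}.jpg"), frame)
        idx += 1
    cap.release()
    return idx

# Hypothetical file name; adapt to the actual naming used in the release.
n = extract_frames("20191104_indoor_left.mp4", "frames/20191104_indoor_left")
print(f"extracted {n} frames")
```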

Firstly, due to the limited observation environment, the distances from pedestrians to the cameras are small (roughly 2 to 8 meters). This leads to strong variation of human body scale within a captured image. Secondly, the border area of each extracted image is blurred because of pedestrian movement and the low quality of the surveillance cameras; this blur also makes the human detection and tracking steps considerably more difficult. Thirdly, the two cameras are installed to observe pedestrians from a horizontal viewpoint. Lastly, as mentioned above, the dataset is captured in both indoor and outdoor environments. The indoor videos suffer from neon lighting, while the outdoor videos are collected under daylight with heavy shadows. In particular, three videos (20191105_indoor_left, 20191105_indoor_right, 20191105_indoor_cross) are affected by direct sunlight, which introduces noise for all steps. These characteristics mean that the dataset contains the common challenges found in existing datasets used for human detection, tracking, and person ReID. To generate the ground truth for human detection evaluation, bounding boxes were manually created with the LabelImg tool, a widely used image annotation tool. Five annotators prepared all ground truth for person detection, tracking, and re-identification.
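LabelImg saves annotations in Pascal VOC XML format by default (one XML file per image, with a <bndbox> element per object). The following is a minimal sketch of how such a file could be parsed into labeled boxes for detection evaluation; it assumes VOC-style XML files and an illustrative file path, not a confirmed FAPR annotation schema.

```python
import xml.etree.ElementTree as ET

def load_voc_boxes(xml_path):
    """Parse one LabelImg/Pascal-VOC annotation file into a list of boxes."""
    root = ET.parse(xml_path).getroot()
    boxes = []
    for obj in root.findall("object"):
        label = obj.findtext("name")  # class or identity label of the object
        bb = obj.find("bndbox")
        boxes.append((
            label,
            int(float(bb.findtext("xmin"))),
            int(float(bb.findtext("ymin"))),
            int(float(bb.findtext("xmax"))),
            int(float(bb.findtext("ymax"))),
        ))
    return boxes

# Hypothetical annotation file name, for illustration only.
print(load_voc_boxes("annotations/000001.xml"))
```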

Details

The detailed statistics of the FAPR dataset are shown in the table below:

Videos                    #Images   #Bounding boxes   #BB/Image   #IDs   #Tracklets
indoor                        489              1153        2.36      7            7
outdoor_easy                 1499              2563        1.71      6            7
outdoor_hard                 2702              6552        2.42      8           20
20191104_indoor_left          363              1287        3.55     10           10
20191104_indoor_right         440              1266        2.88     10           13
20191104_indoor_cross         240              1056        4.40     10           10
20191104_outdoor_left         449              1333        2.97     10           10
20191104_outdoor_right        382              1406        3.68     10           11
20191104_outdoor_cross        200               939        4.70     10           12
20191105_indoor_left          947              1502        1.59     10           11
20191105_indoor_right         474              1119        2.36     10           10
20191105_indoor_cross        1447              3087        2.13     10           21
20191105_outdoor_left         765              1565        2.05     11           11
20191105_outdoor_right        470              1119        2.38     10           11
20191105_outdoor_cross       1009              2620        2.60      9           17
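The #BB/Image column is simply the ratio of annotated bounding boxes to annotated frames for each video (e.g. 1153 / 489 ≈ 2.36 for the indoor video). Below is a minimal sketch of how such per-video statistics could be recomputed from per-frame box lists; the input dictionary is illustrative and not part of the dataset release.

```python
def per_video_stats(frame_boxes):
    """frame_boxes: {frame_name: list of (track_id, xmin, ymin, xmax, ymax)}."""
    n_images = len(frame_boxes)
    n_boxes = sum(len(b) for b in frame_boxes.values())
    ids = {tid for boxes in frame_boxes.values() for tid, *_ in boxes}
    return {
        "#Images": n_images,
        "#Bounding boxes": n_boxes,
        "#BB/Image": round(n_boxes / n_images, 2) if n_images else 0.0,
        "#IDs": len(ids),
    }

# Tiny illustrative example: two frames, three boxes, two identities.
example = {
    "000001.jpg": [(1, 10, 20, 60, 180), (2, 200, 30, 260, 190)],
    "000002.jpg": [(1, 15, 22, 65, 182)],
}
print(per_video_stats(example))  # -> 2 images, 3 boxes, 1.5 BB/Image, 2 IDs
```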

Terms & Conditions of Use

The dataset is released for academic research only and is free to researchers from educational or research institutions for non-commercial purposes.

Related Publications

All publications using the FAPR dataset or any dataset derived from it should cite the following papers:

  1. Hong-Quan Nguyen, Thuy-Binh Nguyen, Duc-Long Tran, Thi-Lan Le. A Unified Framework for Automated Person Re-identification. Transport and Communications Science Journal, Vol. 71, Issue 7, Sep. 2020, pp. 868-880.

  2. Hong-Quan Nguyen, Thuy-Binh Nguyen, T. A. Le, Thi-Lan Le, Thanh-Hai Vu, Alexis Noe. Comparative Evaluation of Human Detection and Tracking Approaches for Online Tracking Applications. In 2019 International Conference on Advanced Technologies for Communications (ATC), IEEE, 2019, pp. 348-353.

Download

The requestor must sign the commitment form and send it to the database administrator (lan.lethi1@hust.edu.vn) by email.