
FishEye8K: A Benchmark and Dataset for Fisheye Camera Object Detection

Authors :
Gochoo, Munkhjargal
Otgonbold, Munkh-Erdene
Ganbold, Erkhembayar
Hsieh, Jun-Wei
Chang, Ming-Ching
Chen, Ping-Yang
Dorj, Byambaa
Jassmi, Hamad Al
Batnasan, Ganzorig
Alnajjar, Fady
Abduljabbar, Mohammed
Lin, Fang-Pang
Publication Year :
2023

Abstract

With the advance of AI, road object detection has become a prominent topic in computer vision, mostly using perspective cameras. Fisheye lenses provide omnidirectional wide coverage, allowing fewer cameras to monitor road intersections, albeit with view distortions. To our knowledge, there is no existing open dataset prepared for traffic surveillance with fisheye cameras. This paper introduces the open FishEye8K benchmark dataset for road object detection tasks, which comprises 157K bounding boxes across five classes (Pedestrian, Bike, Car, Bus, and Truck). In addition, we present benchmark results of state-of-the-art (SoTA) models, including variants of YOLOv5, YOLOR, YOLOv7, and YOLOv8. The dataset comprises 8,000 images recorded in 22 videos using 18 fisheye cameras for traffic monitoring in Hsinchu, Taiwan, at resolutions of 1080×1080 and 1280×1280. The data annotation and validation process was arduous and time-consuming due to the ultra-wide panoramic and hemispherical fisheye camera images with large distortion and numerous road participants, particularly people riding scooters. To avoid bias, frames from a particular camera were assigned to either the training or the test set, maintaining a ratio of about 70:30 for both the number of images and the bounding boxes in each class. Experimental results show that YOLOv8 and YOLOR perform best at input sizes of 640×640 and 1280×1280, respectively. The dataset will be available on GitHub with PASCAL VOC, MS COCO, and YOLO annotation formats. The FishEye8K benchmark will provide significant contributions to fisheye video analytics and smart city applications.

Comment: CVPR Workshops 2023
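The abstract describes a camera-level split: every frame from a given camera goes entirely to either the training or the test set, at roughly a 70:30 image ratio. The following is a minimal sketch of that idea, assuming frame filenames encode the source camera ID (a hypothetical "camXX_frame" naming); the actual FishEye8K file layout and split procedure may differ.

```python
from collections import defaultdict
from pathlib import Path
import random

def split_by_camera(image_dir: str, train_ratio: float = 0.7, seed: int = 0):
    """Assign all frames of each camera to either train or test,
    so no camera contributes images to both sets."""
    frames_by_camera = defaultdict(list)
    for img in Path(image_dir).glob("*.jpg"):
        camera_id = img.stem.split("_")[0]  # assumed "camXX_000123" naming
        frames_by_camera[camera_id].append(img)

    cameras = sorted(frames_by_camera)
    random.Random(seed).shuffle(cameras)

    train, test = [], []
    total = sum(len(v) for v in frames_by_camera.values())
    for cam in cameras:
        # Greedily assign whole cameras to the training set until
        # roughly train_ratio of all images are covered.
        if len(train) < train_ratio * total:
            train.extend(frames_by_camera[cam])
        else:
            test.extend(frames_by_camera[cam])
    return train, test
```

Note that this sketch balances only the overall image count; the authors additionally keep the per-class bounding-box counts near 70:30, which would require checking class statistics per camera when assigning it to a split.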

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2305.17449
Document Type :
Working Paper