Real-time object detector on the edge using cells of interest instead of pixels with YOLIC
YOLIC: An Efficient Method for Object Localization and Classification on Edge Devices
arXiv paper abstract https://arxiv.org/abs/2307.06689
arXiv PDF paper https://arxiv.org/pdf/2307.06689.pdf
Project page https://kai3316.github.io/yolic.github.io
In ... Tiny AI, ... introduce "You Only Look at Interested Cells" (YOLIC), an efficient method for object localization and classification on edge devices.
Seamlessly blending the strengths of semantic segmentation and object detection, YOLIC offers superior computational efficiency and precision.
By adopting Cells of Interest for classification instead of individual pixels, YOLIC encapsulates relevant information, reduces computational load, and enables rough object shape inference.
Importantly, the need for bounding box regression is obviated, as YOLIC capitalizes on the predetermined cell configuration that provides information about potential object location, size, and shape.
To tackle the limitations of single-label classification, a multi-label classification approach is applied to each cell, effectively recognizing overlapping or closely situated objects.
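To make the idea concrete, here is a minimal PyTorch-style sketch of a YOLIC-style head that predicts a multi-label class vector for each predefined cell of interest. The backbone, cell count, class count, and layer sizes are assumptions for illustration, not the authors' exact architecture.

import torch
import torch.nn as nn

NUM_CELLS = 96    # assumed number of predefined cells of interest
NUM_CLASSES = 10  # assumed number of object classes

class YolicStyleHead(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int):
        super().__init__()
        self.backbone = backbone                 # any lightweight CNN backbone
        self.pool = nn.AdaptiveAvgPool2d(1)
        # One score per (cell, class) pair; no bounding-box regression branch.
        self.classifier = nn.Linear(feat_dim, NUM_CELLS * NUM_CLASSES)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.pool(self.backbone(x)).flatten(1)
        logits = self.classifier(feats)
        return logits.view(-1, NUM_CELLS, NUM_CLASSES)  # per-cell multi-label logits

# Multi-label training uses an independent binary loss per (cell, class):
criterion = nn.BCEWithLogitsLoss()

# Toy usage with a stand-in backbone:
backbone = nn.Sequential(nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU())
model = YolicStyleHead(backbone, feat_dim=32)
images = torch.randn(2, 3, 224, 224)
targets = torch.randint(0, 2, (2, NUM_CELLS, NUM_CLASSES)).float()
loss = criterion(model(images), targets)

Because each cell gets an independent sigmoid score per class, a cell covered by two overlapping objects can simply be positive for both classes, which is what removes the single-label limitation.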
... YOLIC achieves detection performance comparable to the state-of-the-art YOLO algorithms while surpassing them in speed, exceeding 30fps on a Raspberry Pi 4B CPU ...
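If you want to sanity-check a claim like 30fps on your own edge device, a rough throughput measurement can look like the sketch below; the model, input size, and iteration counts are assumptions, not the paper's benchmark setup.

import time
import torch

def measure_fps(model: torch.nn.Module, input_size=(1, 3, 224, 224), iters: int = 100) -> float:
    """Average single-image inference throughput on CPU, in frames per second."""
    model.eval()
    x = torch.randn(*input_size)
    with torch.no_grad():
        for _ in range(10):           # warm-up runs
            model(x)
        start = time.perf_counter()
        for _ in range(iters):
            model(x)
        elapsed = time.perf_counter() - start
    return iters / elapsed

# print(f"{measure_fps(model):.1f} fps")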
Please like and share this post if you enjoyed it using the buttons at the bottom!
Stay up to date. Subscribe to my posts https://morrislee1234.wixsite.com/website/contact
Web site with my other posts by category https://morrislee1234.wixsite.com/website