Semantic-Pixel Associative Information Improving Loop Closure Detection and Experience Map Building for Efficient Visual Representation

EasyChair Preprint 11134 • 12 pages • Date: October 23, 2023

Abstract

RatSLAM is a brain-inspired simultaneous localization and mapping (SLAM) system based on the rodent hippocampus model that constructs an experience map of the environment. However, the map it builds suffers from low accuracy and poor adaptability to changing illumination because of its simple visual processing. In this paper, we present a novel RatSLAM system that uses richer semantic object information for loop closure detection (LCD) and experience map building, inspired by the effectiveness of semantic information for scene recognition in the biological brain. Specifically, we calculate the similarity between the current and previous scenes in LCD from the pixel information computed by the sum of absolute differences (SAD) and the semantic information extracted by the YOLOv2 network. We then build an enhanced experience map with object-level information, using 3D model segmentation to perform instance-level semantic segmentation on the recognized objects. By fusing this semantic information into the visual representation, the proposed model mitigates the impact of illumination changes and expresses the multi-dimensional information in the environment. Experimental results on the Oxford New College, City Center, and Lab datasets demonstrate superior LCD accuracy and mapping performance, especially in environments with changing illumination.

Keyphrases: brain-inspired simultaneous localization and mapping, loop closure detection, semantic information
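The abstract's core idea of fusing SAD-based pixel similarity with semantic cues from an object detector for loop closure detection can be illustrated with a minimal sketch. The abstract does not give the exact formulas, so the 1/(1+SAD) mapping, the Jaccard overlap of detected class labels, the weighted-sum fusion, and the weight w_pixel below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def sad_similarity(profile_a, profile_b):
    """Pixel-level similarity via the sum of absolute differences (SAD),
    in the spirit of RatSLAM view-template matching. Lower SAD means
    more similar; the mapping to (0, 1] below is an assumption."""
    sad = np.mean(np.abs(np.asarray(profile_a, float) - np.asarray(profile_b, float)))
    return 1.0 / (1.0 + sad)

def semantic_similarity(labels_a, labels_b):
    """Object-level similarity from detector class labels (e.g., YOLOv2
    detections), measured here as Jaccard overlap of the label sets (assumed)."""
    a, b = set(labels_a), set(labels_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def scene_similarity(profile_a, labels_a, profile_b, labels_b, w_pixel=0.5):
    """Fuse pixel and semantic cues into a single LCD score.
    The weighted sum and the value of w_pixel are illustrative choices."""
    s_pix = sad_similarity(profile_a, profile_b)
    s_sem = semantic_similarity(labels_a, labels_b)
    return w_pixel * s_pix + (1.0 - w_pixel) * s_sem

# Example: compare the current scene against a stored view template
current = (np.random.rand(64), ["car", "tree", "bench"])
stored = (np.random.rand(64), ["car", "bench"])
score = scene_similarity(current[0], current[1], stored[0], stored[1])
print(f"loop-closure score: {score:.3f}")  # accept as a loop closure if above a threshold
```

In this sketch the semantic term is unaffected by global brightness changes, which mirrors the abstract's claim that adding object-level information makes loop closure detection more robust to changing illumination.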