Time-Interval-Guided Event Representation for Scene Understanding


Abstract

The recovery of scenes under extreme lighting conditions is pivotal for effective image analysis and feature detection. Traditional cameras struggle in such scenarios due to their low dynamic range and limited spectral response. In this paper, we advocate the adoption of event cameras to reconstruct static scenes, particularly those under low illumination. We introduce a new method to explain the phenomenon whereby event cameras continue to generate events even in the absence of brightness changes, highlighting the crucial role that noise plays in this process. Furthermore, we show that these events predominantly occur in pairs and establish a correlation between the time interval within an event pair and the relative light intensity of the scene. A key contribution of our work is an innovative method that converts sparse event streams into dense intensity frames without relying on any active light source or motion, enabling static imaging with event cameras. This method expands the application of event cameras to static vision tasks such as HDR imaging and demonstrates a practical use case. The feasibility of our method is validated through multiple experiments.
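The abstract's core idea — that the time interval between the two events of a noise-triggered pair correlates with relative light intensity — can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the paper's actual algorithm: the function name, the ON-then-OFF pairing convention, and the inverse-proportionality model (shorter mean interval → brighter pixel) are all hypothetical.

```python
import numpy as np

def intensity_from_event_pairs(events, height, width):
    """Estimate a relative-intensity frame from noise-event pairs.

    `events` is a list of (x, y, t, polarity) tuples recorded on a
    static scene. Assumed model (for illustration only): at each pixel,
    noise triggers events in ON/OFF pairs, and the interval between the
    two events of a pair shrinks as incident light grows, so relative
    intensity is taken inversely proportional to the mean interval.
    """
    last_on = {}                      # pixel -> timestamp of pending ON event
    interval_sum = np.zeros((height, width))
    pair_count = np.zeros((height, width))

    for x, y, t, p in events:
        if p > 0:                     # ON event opens a pair
            last_on[(x, y)] = t
        elif (x, y) in last_on:       # OFF event closes it
            interval_sum[y, x] += t - last_on.pop((x, y))
            pair_count[y, x] += 1

    # Mean interval per pixel; pixels with no pairs get +inf (-> intensity 0).
    mean_interval = np.divide(interval_sum, pair_count,
                              out=np.full((height, width), np.inf),
                              where=pair_count > 0)
    frame = 1.0 / mean_interval       # shorter interval -> brighter pixel
    return frame / frame.max() if frame.max() > 0 else frame
```

For example, a pixel whose event pairs are 1 ms apart would be rendered twice as bright as one whose pairs are 2 ms apart, which matches the claimed interval-to-intensity correlation in spirit even though the paper's actual mapping may differ.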
