Please use this identifier to cite or link to this item:
http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/8416
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Rajput, Amitesh Singh | - |
dc.date.accessioned | 2023-01-09T11:17:35Z | - |
dc.date.available | 2023-01-09T11:17:35Z | - |
dc.date.issued | 2018 | - |
dc.identifier.uri | https://www.springerprofessional.de/en/fire-detection-using-dense-trajectories/15740418 | - |
dc.identifier.uri | http://dspace.bits-pilani.ac.in:8080/xmlui/handle/123456789/8416 | - |
dc.description.abstract | This paper proposes an automatic computer-vision-based system for fire detection in videos. Many previous methods exist for video-based fire detection, but very few of them consider the challenge of camera motion or motion of the background scene when computing features based on the motion of fire. Our method is divided into two phases. First, we train our system on the color characteristics of fire using a Gaussian mixture model (GMM) and on texture features computed using local binary patterns (LBPs). Next, dense trajectories are computed to obtain motion features that are robust to camera motion and moving backgrounds. Bounding boxes are detected with the help of the color and texture models. Subsequently, dense trajectories are projected onto codebooks to compute feature vectors, and a chi-square kernel-based SVM is employed to classify fire and non-fire motion representations. Quantitative evaluation of our method indicates the suitability of temporal features for fire detection. (An illustrative sketch of this pipeline follows the metadata table below.) | en_US |
dc.language.iso | en | en_US |
dc.publisher | Springer | en_US |
dc.subject | Computer Science | en_US |
dc.subject | Fire Detection | en_US |
dc.title | Fire Detection Using Dense Trajectories | en_US |
dc.type | Article | en_US |
Appears in Collections: Department of Computer Science and Information Systems
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.