
Please use this identifier to cite or link to this item: http://dspace.bits-pilani.ac.in:8080/jspui/xmlui/handle/123456789/8634
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ghosal, Sugata | -
dc.date.accessioned | 2023-01-21T07:21:22Z | -
dc.date.available | 2023-01-21T07:21:22Z | -
dc.date.issued | 1996 | -
dc.identifier.uri | https://ieeexplore.ieee.org/document/517151 | -
dc.identifier.uri | http://dspace.bits-pilani.ac.in:8080/xmlui/handle/123456789/8634 | -
dc.description.abstract | Automatic target recognition (ATR) applications simultaneously require a wide field of view (FOV) for better detection and situation awareness, high resolution for target recognition and threat assessment, and a high frame rate for detecting brief events and disambiguating frame-to-frame correlation. Uniformly sampling the entire FOV at recognition resolution is simply wasteful in ATR scenarios with localized regions of interest (ROIs). Foveal data acquisition with space-variant sampling and context-sensitive sensor articulation is highly optimized for active ATR applications. We propose a multiscale local Zernike filter-based front-end target detection technique for a commercially feasible foveal sensor topology with a piecewise-constant resolution profile. Anisotropic heat diffusion is employed for preprocessing of the foveal data. Expansion template matching is used to derive a detection filter that optimizes the discriminant signal-to-noise ratio (SNR). Results are presented with simulated foveal imagery derived from real uniform-acuity FLIR data. | en_US
dc.language.iso | en | en_US
dc.publisher | IEEE | en_US
dc.subject | Computer Science | en_US
dc.subject | Object detection | en_US
dc.subject | Target recognition | en_US
dc.subject | Event detection | en_US
dc.subject | Anisotropic magnetoresistance | en_US
dc.title | Target detection in foveal ATR systems | en_US
dc.type | Article | en_US
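The abstract mentions anisotropic heat diffusion as the preprocessing step for the foveal imagery. The paper's exact scheme is not given in this record, so the following is a minimal sketch of the standard Perona-Malik formulation of anisotropic diffusion, which smooths sensor noise while preserving edges; all parameter values (`n_iter`, `kappa`, `step`) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=30.0, step=0.2):
    """Perona-Malik anisotropic diffusion: iteratively smooths an
    image while slowing diffusion across strong gradients (edges)."""
    u = img.astype(np.float64).copy()
    for _ in range(n_iter):
        # Nearest-neighbor differences (periodic borders via np.roll,
        # a simplification for brevity).
        dn = np.roll(u, 1, axis=0) - u   # north
        ds = np.roll(u, -1, axis=0) - u  # south
        de = np.roll(u, -1, axis=1) - u  # east
        dw = np.roll(u, 1, axis=1) - u   # west
        # Edge-stopping conductance g(d) = exp(-(d/kappa)^2):
        # near 1 in flat regions, near 0 across strong edges.
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        # Explicit Euler update (step <= 0.25 keeps it stable).
        u += step * (cn * dn + cs * ds + ce * de + cw * dw)
    return u
```

On a noisy step-edge image, the flat regions are denoised while the step contrast survives, which is the property that makes this family of filters attractive ahead of template-matching detection.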
Appears in Collections:Department of Computer Science and Information Systems

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.