Uğur Güdükbay's Publications


Deep Convolutional Generative Adversarial Networks for Flame Detection in Video

Süleyman Aslan, Uğur Güdükbay, B. Uğur Töreyin, and A. Enis Çetin. Deep Convolutional Generative Adversarial Networks for Flame Detection in Video. In Computational Collective Intelligence, pp. 807–815, CCI '20, Lecture Notes in Computer Science (LNCS) 12496, Springer International Publishing, Cham, November 2020.

Download

[PDF] http://www.cs.bilkent.edu.tr/~gudukbay/publications/papers/conf_papers/Aslan_Et_Al_ICCCI_2020.pdf

Abstract

Real-time flame detection is crucial in video-based surveillance systems. We propose a vision-based method to detect flames using Deep Convolutional Generative Adversarial Networks (DCGANs). Many existing supervised learning approaches using convolutional neural networks do not take temporal information into account and require a substantial amount of labeled data. To have a robust representation of sequences with and without flame, we propose a two-stage training of a DCGAN exploiting spatio-temporal flame evolution. Our training framework includes the regular training of a DCGAN with real spatio-temporal images, namely, temporal slice images, and noise vectors, and training the discriminator separately using the temporal flame images without the generator. Experimental results show that the proposed method effectively detects flame in video with negligible false-positive rates in real time.
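
Training Sketch

To make the two-stage framework above concrete, here is a minimal PyTorch sketch of the idea. It is an illustration under stated assumptions, not the authors' implementation: the 64x64 temporal-slice resolution, the network shapes, the hypothetical flame_loader and no_flame_loader, and the stage-two labeling (flame slices as real, non-flame slices as fake) are choices made for the example; the actual architecture and training details are given in the paper.

# Illustrative two-stage DCGAN training (PyTorch). All sizes, names, and
# hyperparameters below are placeholder assumptions, not the paper's settings.
import torch
import torch.nn as nn

LATENT_DIM, IMG_CH = 100, 3  # assumed noise-vector size and channel count

def build_generator():
    # Standard DCGAN generator: noise vector (LATENT_DIM x 1 x 1) -> 64x64 image in [-1, 1].
    return nn.Sequential(
        nn.ConvTranspose2d(LATENT_DIM, 512, 4, 1, 0, bias=False), nn.BatchNorm2d(512), nn.ReLU(True),
        nn.ConvTranspose2d(512, 256, 4, 2, 1, bias=False), nn.BatchNorm2d(256), nn.ReLU(True),
        nn.ConvTranspose2d(256, 128, 4, 2, 1, bias=False), nn.BatchNorm2d(128), nn.ReLU(True),
        nn.ConvTranspose2d(128, 64, 4, 2, 1, bias=False), nn.BatchNorm2d(64), nn.ReLU(True),
        nn.ConvTranspose2d(64, IMG_CH, 4, 2, 1, bias=False), nn.Tanh())

def build_discriminator():
    # Standard DCGAN discriminator: 64x64 image -> probability of being a real flame slice.
    return nn.Sequential(
        nn.Conv2d(IMG_CH, 64, 4, 2, 1, bias=False), nn.LeakyReLU(0.2, True),
        nn.Conv2d(64, 128, 4, 2, 1, bias=False), nn.BatchNorm2d(128), nn.LeakyReLU(0.2, True),
        nn.Conv2d(128, 256, 4, 2, 1, bias=False), nn.BatchNorm2d(256), nn.LeakyReLU(0.2, True),
        nn.Conv2d(256, 512, 4, 2, 1, bias=False), nn.BatchNorm2d(512), nn.LeakyReLU(0.2, True),
        nn.Conv2d(512, 1, 4, 1, 0, bias=False), nn.Sigmoid(), nn.Flatten())

def train_two_stage(flame_loader, no_flame_loader, epochs=5, device="cpu"):
    # flame_loader / no_flame_loader are assumed to yield batches of temporal slice
    # images as tensors of shape (B, IMG_CH, 64, 64), scaled to [-1, 1].
    G, D = build_generator().to(device), build_discriminator().to(device)
    bce = nn.BCELoss()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4, betas=(0.5, 0.999))
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4, betas=(0.5, 0.999))

    # Stage 1: regular DCGAN training on real temporal slice images vs. noise-generated fakes.
    for _ in range(epochs):
        for real in flame_loader:
            real = real.to(device)
            b = real.size(0)
            fake = G(torch.randn(b, LATENT_DIM, 1, 1, device=device))

            opt_d.zero_grad()
            loss_d = (bce(D(real), torch.ones(b, 1, device=device)) +
                      bce(D(fake.detach()), torch.zeros(b, 1, device=device)))
            loss_d.backward()
            opt_d.step()

            opt_g.zero_grad()
            loss_g = bce(D(fake), torch.ones(b, 1, device=device))
            loss_g.backward()
            opt_g.step()

    # Stage 2: refine the discriminator alone (no generator) so that its output
    # separates flame from non-flame slices; this labeling is an assumption here.
    for _ in range(epochs):
        for flame, no_flame in zip(flame_loader, no_flame_loader):
            flame, no_flame = flame.to(device), no_flame.to(device)
            opt_d.zero_grad()
            loss = (bce(D(flame), torch.ones(flame.size(0), 1, device=device)) +
                    bce(D(no_flame), torch.zeros(no_flame.size(0), 1, device=device)))
            loss.backward()
            opt_d.step()

    return D  # at detection time, thresholding D(slice) would flag flame regions

Since only the trained discriminator is needed at detection time, the generator can be dropped after training, which is consistent with the real-time operation the abstract reports.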

BibTeX

@InProceedings{AslanEtAl2020,
author="S{\"u}leyman Aslan and U{\u{g}}ur G{\"u}d{\"u}kbay and B. U{\u{g}}ur T{\"o}reyin and A. Enis {\c C}etin",
editor="Nguyen, Ngoc Thanh
and Hoang, Bao Hung
and Huynh, Cong Phap
and Hwang, Dosam
and Trawi{\'{n}}ski, Bogdan
and Vossen, Gottfried",
title="Deep Convolutional Generative Adversarial Networks for Flame Detection in Video",
booktitle="Computational Collective Intelligence", 
series = {CCI '20, Lecture Notes in Computer Science (LNCS)},
volume = 12496,
year="2020",
month = {November},
publisher="Springer International Publishing",
address="Cham",
pages="807--815",
abstract="Real-time flame detection is crucial in video-based surveillance systems. 
          We propose a vision-based method to detect flames using Deep Convolutional 
		  Generative Adversarial Neural Networks (DCGANs). Many existing supervised 
		  learning approaches using convolutional neural networks do not take temporal
		  information into account and require a substantial amount of labeled data. 
		  To have a robust representation of sequences with and without flame, we 
		  propose a two-stage training of a DCGAN exploiting spatio-temporal flame
		  evolution. Our training framework includes the regular training of a DCGAN
		  with real spatio-temporal images, namely, temporal slice images, and noise
		  vectors, and training the discriminator separately using the temporal flame
		  images without the generator. Experimental results show that the proposed 
		  method effectively detects flame in video with negligible false-positive 
		  rates in real-time.",
isbn="978-3-030-63007-2",
bib2html_dl_pdf = "http://www.cs.bilkent.edu.tr/~gudukbay/publications/papers/conf_papers/Aslan_Et_Al_ICCCI_2020.pdf",
bib2html_pubtype = {Refereed Conference Papers},
bib2html_rescat = {Computer Vision}
}

Generated by bib2html.pl (written by Patrick Riley) on Sun Apr 21, 2024 11:32:41