Journal of Northwestern Polytechnical University, 2019, Vol. 37, Issue 2: 315-322
Cite this article:
ZHANG Guanghua, WANG Fubao, DUAN Weijun. Study on Star-Galaxy Image Generation Method Based on GAN[J]. Journal of Northwestern Polytechnical University, 2019, 37(2): 315-322

Study on Star-Galaxy Image Generation Method Based on GAN
ZHANG Guanghua, WANG Fubao, DUAN Weijun
School of Electronics and Information, Northwestern Polytechnical University, Xi'an 710072, China
Abstract:
GAN technology has been widely used in the field of image generation. Generating images of stars and galaxies is of great significance for predicting unknown stars and galaxies. This paper applies GAN to star-galaxy image generation for the first time: the GAN model structure is given and a training strategy for GAN is designed. To stabilize the training procedure, we propose an improved neuron-dropout method, optimize several hyper-parameters of the model with a grid-search method, and modify the loss function of the original GAN using the EM (Wasserstein) distance. Taking star-galaxy images from the Sloan Digital Sky Survey (SDSS) as the training dataset, the improved method proposed in this paper and the original GAN were each used to generate star and galaxy images at two different resolutions, and a comparison was made to verify the effectiveness of the improved method.
Key words: generative adversarial network; images of stars and galaxies; stabilized training; loss function
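As a hedged illustration of the loss-function modification described in the abstract (the function names and toy values below are illustrative, not taken from the paper): replacing the original GAN loss with the EM (Wasserstein-1) distance means the discriminator becomes a critic that maximizes the gap between its mean scores on real and generated samples, while the generator raises the critic's mean score on its samples. A minimal NumPy sketch of the two losses:

```python
import numpy as np

def critic_loss(d_real, d_fake):
    # WGAN-style critic loss: the critic maximizes
    # E[D(x_real)] - E[D(x_fake)], i.e. it minimizes the
    # negation below, which estimates -W1(real, fake).
    return np.mean(d_fake) - np.mean(d_real)

def generator_loss(d_fake):
    # The generator tries to raise the critic's score
    # on generated (fake) samples.
    return -np.mean(d_fake)

# Toy critic outputs, standing in for D(x) evaluated on
# batches of real and generated star-galaxy images.
d_real = np.array([0.9, 1.1, 1.0])
d_fake = np.array([0.1, 0.2, 0.0])
print(critic_loss(d_real, d_fake))   # close to -0.9
print(generator_loss(d_fake))        # close to -0.1
```

In practice the critic must also be kept (approximately) 1-Lipschitz, e.g. by weight clipping as in the original WGAN formulation; that constraint is omitted from this sketch.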
Received: 2018-05-09
DOI: 10.1051/jnwpu/20193720315
About the author: ZHANG Guanghua (b. 1989), Ph.D. candidate at Northwestern Polytechnical University; research interests include deep learning and image generation.
