Posted on 2019-06-27, 15:18. Authored by Masoumeh Zareapoor, Huiyu Zhou, Jie Yang
Generative adversarial networks have achieved remarkable success in many computer vision applications owing to their ability to learn complex data distributions. In particular, they are capable of generating realistic images from a latent space with a simple and intuitive structure. Existing models have mainly focused on improving performance, while little attention has been paid to making the model robust. In this paper, we investigate solutions to the super-resolution problem, in particular perceptual quality, by proposing a robust GAN. Unlike the standard GAN, the proposed model employs two generators and two discriminators: one discriminator determines whether samples come from the real data or are generated, while the other acts as a classifier that returns wrongly generated samples to their corresponding generators. The generators learn a mixture of distributions, mapping from the prior to the complex data distribution. The model is trained with a feature-matching loss, which allows wrongly generated samples to be returned to their corresponding generators so that more realistic samples can be regenerated. Experimental results on various datasets show the superiority of the proposed model over state-of-the-art methods.
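The abstract describes the architecture only at a high level. The following is a minimal, hypothetical PyTorch sketch of how a two-generator / two-discriminator setup with a feature-matching loss might be wired up; the network sizes, loss weights, and the rule for attributing "wrong" samples to their generator are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a two-generator / two-discriminator GAN with a
# feature-matching loss. Architectures, weights, and the routing rule for
# "wrong" samples are assumptions for illustration only.
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 784  # e.g. flattened 28x28 images (assumption)

def make_generator():
    return nn.Sequential(
        nn.Linear(latent_dim, 256), nn.ReLU(),
        nn.Linear(256, img_dim), nn.Tanh(),
    )

class Discriminator(nn.Module):
    """Real/fake discriminator exposing intermediate features for feature matching."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(img_dim, 256), nn.LeakyReLU(0.2))
        self.head = nn.Linear(256, 1)  # real-vs-generated logit
    def forward(self, x):
        f = self.features(x)
        return self.head(f), f

G1, G2 = make_generator(), make_generator()
D = Discriminator()                       # decides real vs. generated
C = nn.Sequential(                        # classifier: which generator produced the sample
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2), nn.Linear(256, 2),
)

opt_g = torch.optim.Adam(list(G1.parameters()) + list(G2.parameters()), lr=2e-4)
opt_d = torch.optim.Adam(list(D.parameters()) + list(C.parameters()), lr=2e-4)
bce = nn.BCEWithLogitsLoss()
ce = nn.CrossEntropyLoss()

def train_step(real):
    b = real.size(0)
    z1, z2 = torch.randn(b, latent_dim), torch.randn(b, latent_dim)
    fake1, fake2 = G1(z1), G2(z2)

    # --- discriminator / classifier update ---
    opt_d.zero_grad()
    real_logit, _ = D(real)
    fake_logit1, _ = D(fake1.detach())
    fake_logit2, _ = D(fake2.detach())
    d_loss = (bce(real_logit, torch.ones(b, 1))
              + bce(fake_logit1, torch.zeros(b, 1))
              + bce(fake_logit2, torch.zeros(b, 1)))
    # C learns to attribute generated samples to their source generator,
    # so rejected samples can be routed back to that generator.
    c_loss = (ce(C(fake1.detach()), torch.zeros(b, dtype=torch.long))
              + ce(C(fake2.detach()), torch.ones(b, dtype=torch.long)))
    (d_loss + c_loss).backward()
    opt_d.step()

    # --- generator update: adversarial loss + feature matching ---
    opt_g.zero_grad()
    _, real_feat = D(real)
    real_feat = real_feat.detach()
    logit1, feat1 = D(fake1)
    logit2, feat2 = D(fake2)
    adv = bce(logit1, torch.ones(b, 1)) + bce(logit2, torch.ones(b, 1))
    # Feature matching: match mean discriminator features of real and fake batches.
    fm = (((real_feat.mean(0) - feat1.mean(0)) ** 2).mean()
          + ((real_feat.mean(0) - feat2.mean(0)) ** 2).mean())
    (adv + 10.0 * fm).backward()  # 10.0 is an assumed weighting
    opt_g.step()
    return d_loss.item(), (adv + fm).item()

# Example usage with random data standing in for a real image batch:
d_l, g_l = train_step(torch.rand(16, img_dim) * 2 - 1)
```

In this sketch the feature-matching term pushes the mean of the discriminator's intermediate features for generated batches toward that of real batches, which is one common way such a loss is realised; the paper's exact formulation may differ.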
Funding
This research is partly supported by NSFC, China (U1803261, 61876107, 61572315); 973 Plan, China (2015CB856004). H. Zhou was supported by UK EPSRC under Grant EP/N011074/1, Royal Society-Newton Advanced Fellowship under Grant NA160342 and European Union’s Horizon 2020 research and innovation program under the Marie-Sklodowska-Curie grant agreement No 720325.
History
Citation
Neural Computing and Applications, 2019
Author affiliation
/Organisation/COLLEGE OF SCIENCE AND ENGINEERING/Department of Informatics
The file associated with this record is under embargo until 12 months after publication, in accordance with the publisher's self-archiving policy. The full text may be available through the publisher links provided above.