The masquerade attack on biometric hashing, which reconstructs the original biometric image from a given hashcode, has received much attention recently. It is mainly used to validate the security of biometric recognition systems or to expand existing biometric databases, such as face or iris databases. However, existing state-of-the-art methods tend to ignore the perceptual quality of the biometric images synthesized in the attack, and consequently the synthetic images can be easily distinguished from real images. To obtain high-perceptual-quality images that can simultaneously pass the validation of the recognition system, we introduce a new objective combining semantic invariability in the hashing space with perceptual similarity in the biometric space. To simulate the mapping from images to hashcodes and to tackle the non-differentiability of discrete hashcodes in the hashing space, we propose a DNN-based network named SimHashNet. We then incorporate SimHashNet into a generative adversarial network, yielding our model BiohashGAN, which generates synthetic images from hashcodes. Experimental results on the CASIA-IrisV4.0-Interval and CMU PIE datasets demonstrate that the synthetic images obtained from our model can pass the validation of the recognition system while maintaining high perceptual quality.
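To make the two-term objective concrete, the following is a minimal NumPy sketch, not the paper's implementation: `soft_hash` stands in for a SimHashNet-style differentiable surrogate (here a simple tanh relaxation of the sign function, a common trick for the discrete-hashcode gradient problem), and `combined_loss` pairs a hashing-space matching term with a plain pixel-MSE stand-in for perceptual similarity. The names `W`, `beta`, and `lam`, and the use of tanh and pixel MSE, are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_hash(features, W, beta=5.0):
    """Differentiable surrogate for sign(f @ W).

    tanh(beta * x) approaches sign(x) as beta grows, sidestepping the
    zero-gradient problem of the discrete hashing function. (Assumed
    relaxation for illustration; SimHashNet is a learned DNN.)
    """
    return np.tanh(beta * (features @ W))

def combined_loss(h_synth, h_target, x_synth, x_real, lam=0.1):
    # Semantic term: the (relaxed) hashcode of the synthetic image should
    # match the target hashcode -- "semantic invariability in hashing space".
    semantic = np.mean((h_synth - h_target) ** 2)
    # Perceptual term: pixel MSE as a crude stand-in for "perceptual
    # similarity in biometric space" (the paper presumably uses a
    # stronger perceptual metric).
    perceptual = np.mean((x_synth - x_real) ** 2)
    return semantic + lam * perceptual

# Toy demo: 8-dim features, 4-bit hashcode, 16x16 stand-in "image".
W = rng.standard_normal((8, 4))
f_real = rng.standard_normal(8)
h_target = np.sign(f_real @ W)          # discrete target hashcode
x_real = rng.standard_normal((16, 16))

# A perfect reconstruction drives both terms toward zero.
loss_good = combined_loss(soft_hash(f_real, W, beta=50.0), h_target,
                          x_real, x_real)

# A random reconstruction does not.
f_bad = rng.standard_normal(8)
loss_bad = combined_loss(soft_hash(f_bad, W), h_target,
                         rng.standard_normal((16, 16)), x_real)
```

In a GAN setting, `combined_loss` would be added to the generator's adversarial loss, so the generator is pushed both to fool the discriminator and to reproduce the target hashcode.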