Authors: Niu, Wenjia; Zhang, Kaihao; Luo, Wenhan; Zhong, Yiran; Li, Hongdong
Title: Deep robust image deblurring via blur distilling and information comparison in latent space
ISSN: 0925-2312
Handle: http://hdl.handle.net/1885/308888
DOI: 10.1016/j.neucom.2021.09.019
Date issued: 2021-09-29
Record dates: 2023-12-08; 2022-09-04
Format: application/pdf
Language: en-AU
Rights: © 2021 Elsevier B.V.
Keywords: Image deblur; Deep network; Blur distilling; Information comparison

Abstract: Current deep deblurring methods focus mainly on learning a transfer network that maps synthetically blurred images to clean ones. Although such methods achieve strong performance on their training datasets, they generalize poorly to other datasets with different synthetic blurs, resulting in significantly inferior performance at test time. To alleviate this problem, we propose a latent contrastive model, Blur Distilling and Information Reconstruction Networks (BDIRNet), to learn an image prior and improve the robustness of deep deblurring. The proposed BDIRNet consists of a blur-removing network (DistillNet) and a reconstruction network (RecNet). Two kinds of images with almost the same content but different quality are fed into DistillNet, which extracts their shared structural information by contrasting latent representations and filters out perturbations from unimportant information such as blur. RecNet then reconstructs sharp images from the extracted information. In addition, statistical anti-interference distilling (SAID) and statistical anti-interference reconstruction (SAIR) modules are proposed inside DistillNet and RecNet, respectively, to further enhance the robustness of our method. Extensive experiments on different datasets show that the proposed method achieves improved and more robust results compared to recent state-of-the-art methods.

Funding: This work is funded in part by the ARC Centre of Excellence for Robotics Vision (CE140100016), ARC-Discovery (DP190102261) and ARC-LIEF (190100080) grants, as well as a research grant from Baidu on autonomous driving. The authors gratefully acknowledge the GPUs donated by NVIDIA Corporation.
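
The abstract describes a two-stage pipeline: an encoder (DistillNet) that contrasts the latents of a paired blurred/sharp input to isolate shared structure, and a decoder (RecNet) that reconstructs the sharp image from that latent. Below is a minimal, hypothetical PyTorch sketch of that distill-then-reconstruct idea. The module names DistillNet and RecNet come from the abstract, but every layer choice, the simple L1 latent-consistency loss standing in for the paper's contrastive latent comparison, and the training step are illustrative assumptions, not the authors' implementation; the SAID/SAIR modules are omitted because the abstract gives no detail about them.

```python
# Hypothetical sketch of the BDIRNet idea described in the abstract:
# DistillNet encodes a blurred/sharp pair into latent features, a
# consistency loss pulls the two latents together, and RecNet decodes
# the latent back into a sharp image. All layer sizes and losses here
# are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DistillNet(nn.Module):
    """Shared encoder that distills structure information into a latent map."""
    def __init__(self, ch=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(ch, ch, 3, padding=1),
        )

    def forward(self, x):
        return self.encoder(x)

class RecNet(nn.Module):
    """Decoder that reconstructs a sharp image from the distilled latent."""
    def __init__(self, ch=64):
        super().__init__()
        self.decoder = nn.Sequential(
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            nn.Conv2d(ch, 3, 3, padding=1),
        )

    def forward(self, z):
        return self.decoder(z)

def training_step(distill, rec, blurred, sharp, alpha=0.1):
    """One step: contrast the latents of the paired inputs, then reconstruct."""
    z_blur = distill(blurred)    # latent of the low-quality input
    z_sharp = distill(sharp)     # latent of the high-quality counterpart
    # Latent consistency: push the blurred latent toward the sharp one
    # (a simple stand-in for the paper's contrastive latent comparison).
    loss_latent = F.l1_loss(z_blur, z_sharp.detach())
    # Reconstruction: the decoded output should match the sharp target.
    loss_rec = F.l1_loss(rec(z_blur), sharp)
    return loss_rec + alpha * loss_latent

if __name__ == "__main__":
    distill, rec = DistillNet(), RecNet()
    blurred = torch.rand(2, 3, 64, 64)   # dummy blurred batch
    sharp = torch.rand(2, 3, 64, 64)     # dummy sharp counterpart
    print(training_step(distill, rec, blurred, sharp).item())
```

Detaching the sharp latent in the consistency term reflects the distilling intuition in the abstract: the clean image supplies the reference structure, and only the blurred branch is pushed toward it, so blur-induced perturbations are filtered out rather than averaged in.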