The underwater environment is poorly lit, and it is difficult to obtain clear images with natural light alone, so artificial light sources are needed to provide supplementary illumination. However, inappropriate artificial lighting leads to uneven scene brightness. Underwater images taken in such hybrid lighting environments, with both natural and artificial light sources, are severely degraded, which not only harms visual perception but also hinders the subsequent execution of high-level computer vision tasks. Most existing methods consider only the influence of natural light and are therefore ineffective at restoring underwater images captured under hybrid light sources. To address the uneven illumination, color bias, and blurred details of underwater images in hybrid lighting environments, we propose an Illumination-aware Encoder-Decoder Network (IEDN) for underwater image restoration. On one hand, an attention mechanism and enhanced residual blocks are incorporated into a multi-scale structure to effectively extract detailed structural features. On the other hand, an illumination-aware map is introduced as a prior constraint to balance the contrast of the restoration results. Meanwhile, appropriate loss functions are designed to guide the network to fully learn the nonlinear mapping between the underwater image and the ground truth. In this way, the tone of the restored image is more natural and its texture details are richer. Comparison experiments show that the proposed method outperforms other algorithms both quantitatively and qualitatively, and ablation experiments further demonstrate the effectiveness of the network modules and the illumination-aware prior.
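To illustrate how an illumination-aware map can act as a prior constraint, the following is a minimal NumPy sketch. It is only a generic illustration, not the paper's actual formulation: the illumination estimator (smoothed channel-wise maximum) and the inverse-illumination weighting of an L1 loss are both assumed heuristics chosen for clarity.

```python
import numpy as np


def illumination_map(img, k=5):
    """Estimate a per-pixel illumination prior for an image in [0, 1].

    Uses the channel-wise maximum smoothed by a box filter -- a common
    heuristic; the paper's exact estimator may differ.
    """
    illum = img.max(axis=2)
    pad = k // 2
    p = np.pad(illum, pad, mode="edge")
    out = np.empty_like(illum)
    for i in range(illum.shape[0]):
        for j in range(illum.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].mean()
    return out


def illumination_weighted_l1(pred, gt, illum, eps=1e-3):
    """L1 loss weighted by inverse illumination, so poorly-lit regions
    contribute more to the objective -- one plausible way to let the
    illumination map balance contrast during restoration."""
    w = 1.0 / (illum[..., None] + eps)  # broadcast over color channels
    return float(np.mean(w * np.abs(pred - gt)))
```

Under this weighting, the same pixel error costs more in dark regions than in bright ones, which pushes the network to pay extra attention to the under-lit parts of a hybrid-lighting scene.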