Underwater scene prior inspired deep underwater image and video enhancement

Bibliographic Details
Published in: Pattern Recognition, Vol. 98, p. 107038
Main Authors: Li, Chongyi; Anwar, Saeed; Porikli, Fatih
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.02.2020
Abstract

Highlights:
• An underwater image and video synthesis approach is desirable for data-driven methods.
• An underwater scene prior is helpful for underwater image and video enhancement.
• The light-weight network structure can be easily extended to underwater video.

In underwater scenes, wavelength-dependent light absorption and scattering degrade the visibility of images and videos. The degraded underwater images and videos affect the accuracy of pattern recognition, visual understanding, and key feature extraction in underwater scenes. In this paper, we propose an underwater image enhancement convolutional neural network (CNN) model based on an underwater scene prior, called UWCNN. Instead of estimating the parameters of the underwater imaging model, the proposed UWCNN model directly reconstructs the clear latent underwater image, benefiting from the underwater scene prior, which can also be used to synthesize underwater image training data. Moreover, thanks to its light-weight network structure and effective training data, our UWCNN model can be easily extended to underwater videos for frame-by-frame enhancement. Specifically, by combining an underwater imaging physical model with the optical properties of underwater scenes, we first synthesize underwater image degradation datasets that cover a diverse set of water types and degradation levels. Then, a light-weight CNN model is designed for each underwater scene type and trained on the corresponding data. Finally, the UWCNN model is directly extended to underwater video enhancement. Experiments on real-world and synthetic underwater images and videos demonstrate that our method generalizes well to different underwater scenes.
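The data-synthesis step described in the abstract rests on the widely used simplified underwater image formation model I(x) = J(x)·t(x) + B·(1 − t(x)), where the per-channel transmission t(x) = exp(−β·d(x)) depends on a wavelength-dependent attenuation coefficient β and scene depth d. The sketch below is a minimal illustration of that idea, not the paper's exact pipeline: the attenuation coefficients, background (veiling-light) colour, and the tiny random "clean" image are made-up values for demonstration only.

```python
import numpy as np

def synthesize_underwater(clean, depth, beta, background):
    """Degrade a clean RGB image with the simplified underwater image
    formation model I = J * t + B * (1 - t), where the per-channel
    transmission t = exp(-beta * d) uses wavelength-dependent
    attenuation coefficients beta and per-pixel depth d."""
    clean = clean.astype(np.float64)                       # J: HxWx3 in [0, 1]
    # per-pixel, per-channel transmission map, shape HxWx3
    t = np.exp(-np.outer(depth.ravel(), beta)).reshape(*depth.shape, 3)
    degraded = clean * t + np.asarray(background) * (1.0 - t)
    return np.clip(degraded, 0.0, 1.0)

# Illustrative (not the paper's) coefficients: red light attenuates
# fastest in water, so beta is largest for the red channel.
beta = np.array([0.8, 0.3, 0.2])        # per-channel attenuation (R, G, B)
background = np.array([0.1, 0.5, 0.7])  # bluish-green veiling light B
rng = np.random.default_rng(0)
clean = rng.random((4, 4, 3))           # stand-in "clean" image
depth = np.full((4, 4), 2.0)            # uniform scene depth in metres
out = synthesize_underwater(clean, depth, beta, background)
```

Sweeping `beta` and `background` over different water types and depth ranges is what yields a dataset covering diverse degradation levels, as the abstract describes.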
Article Number: 107038

Author affiliations:
– Li, Chongyi (ORCID: 0000-0003-2609-2460; email: lichongyi@tju.edu.cn): Department of Computer Science, City University of Hong Kong (CityU), Hong Kong
– Anwar, Saeed: Data61, CSIRO, ACT 2601, Australia
– Porikli, Fatih: Research School of Engineering, The Australian National University, Canberra, ACT 0200, Australia
Copyright: 2019 Elsevier Ltd
DOI: 10.1016/j.patcog.2019.107038
EISSN: 1873-5142
ISSN: 0031-3203
Peer Reviewed: Yes
Keywords: Deep learning; Underwater image synthesis; Pattern recognition; Underwater image and video enhancement and restoration
References Lore, Akintayo, Sarkar (bib0003) 2017; 61
Ancuti, Ancuti (bib0014) 2012
Li, Guo, Guo, Cong, Gong (bib0015) 2017; 94
Wang, Xu, Shen, Zhu (bib0004) 2018
Yang, Yan, Lu, Jia, Xie, Gao (bib0005) 2019; 86
Wang, Guo, Cheng, Borji (bib0034) 2018
Zhou, Yuan (bib0035) 2019; 86
Li, Cong, Hou, Zhang, Qian, Kwong (bib0036) 2019
Li, Guo (bib0013) 2015; 24
Li, Guo, Chen, Tang, Pang, Wang (bib0024) 2016
Huang, He, Fan, Zhang (bib0031) 2010; 43
Galdran, Pardo, Picón, Alvarez-Gila (bib0021) 2015; 26
Yang, Sowmya (bib0039) 2015; 24
Shen, Xu, Kautz, Yang (bib0030) 2018
Li, Guo, Guo, Han, Fu, Cong (bib0029) 2019
Berman, Treibitz, Avidan (bib0009) 2017
Guo, Li, Guo, Cong, Fu, Han (bib0007) 2019; 28
Li, Skinner, Eustice, Johnson-Roberson (bib0027) 2017; 3
Sheinin, Schechner (bib0028) 2017
Guo, Li, Zhuang (bib0017) 2019
Wang, Shen (bib0010) 2018; 27
Gu, Wang, Kuen, Ma, Shahroudy, Shuai, Liu, Wang, Cai, Chen (bib0012) 2018; 77
Peng, Cosman (bib0026) 2017; 26
Lopes, de Aguiar, Souza, Oliveira-Santos (bib0032) 2017; 61
Song, Wang, Zhao, Lam (bib0011) 2018
Chiang, Chen (bib0020) 2012; 21
Huang, Liu, van der Maaten, Weinberger (bib0033) 2017
Ancuti, Ancuti, Vleeschouwer (bib0018) 2018; 27
Akkaynak, Treibitz (bib0001) 2018
He, Sun, Tang (bib0023) 2011; 33
Chikkerur, Cartwright, Govindaraju (bib0002) 2007; 40
Li, Guo, Cong, Pang, Wang (bib0025) 2016; 25
Silberman, Hoiem, Kohli, Fergus (bib0037) 2012
Drews, Nascimento, Botelho, Campos (bib0022) 2016; 36
Li, Guo, Guo (bib0016) 2018; 25
Wu, Shen, Hengel (bib0008) 2019; 90
Wang, Bovik, Sheikh, Simoncelli (bib0038) 2004; 13
Wang, Shen, Porikli, Yang (bib0006) 2019; 41
Li, Guo, Ren, Cong, Hou, Kwong (bib0019) 2019
Panetta, Gao, Agaian (bib0040) 2016; 41
Peng (10.1016/j.patcog.2019.107038_bib0026) 2017; 26
Li (10.1016/j.patcog.2019.107038_bib0025) 2016; 25
Yang (10.1016/j.patcog.2019.107038_bib0005) 2019; 86
Li (10.1016/j.patcog.2019.107038_bib0027) 2017; 3
Huang (10.1016/j.patcog.2019.107038_bib0033) 2017
Galdran (10.1016/j.patcog.2019.107038_bib0021) 2015; 26
Huang (10.1016/j.patcog.2019.107038_bib0031) 2010; 43
Li (10.1016/j.patcog.2019.107038_sbref0019) 2019
Wang (10.1016/j.patcog.2019.107038_bib0038) 2004; 13
Ancuti (10.1016/j.patcog.2019.107038_bib0014) 2012
Wu (10.1016/j.patcog.2019.107038_bib0008) 2019; 90
Li (10.1016/j.patcog.2019.107038_bib0016) 2018; 25
Drews (10.1016/j.patcog.2019.107038_bib0022) 2016; 36
Silberman (10.1016/j.patcog.2019.107038_bib0037) 2012
Berman (10.1016/j.patcog.2019.107038_bib0009) 2017
Sheinin (10.1016/j.patcog.2019.107038_bib0028) 2017
Lore (10.1016/j.patcog.2019.107038_bib0003) 2017; 61
Song (10.1016/j.patcog.2019.107038_bib0011) 2018
Wang (10.1016/j.patcog.2019.107038_bib0006) 2019; 41
He (10.1016/j.patcog.2019.107038_bib0023) 2011; 33
Li (10.1016/j.patcog.2019.107038_bib0015) 2017; 94
Lopes (10.1016/j.patcog.2019.107038_bib0032) 2017; 61
Li (10.1016/j.patcog.2019.107038_bib0036) 2019
Li (10.1016/j.patcog.2019.107038_bib0024) 2016
Panetta (10.1016/j.patcog.2019.107038_bib0040) 2016; 41
Ancuti (10.1016/j.patcog.2019.107038_bib0018) 2018; 27
Chiang (10.1016/j.patcog.2019.107038_bib0020) 2012; 21
Li (10.1016/j.patcog.2019.107038_bib0029) 2019
Chikkerur (10.1016/j.patcog.2019.107038_bib0002) 2007; 40
Wang (10.1016/j.patcog.2019.107038_bib0004) 2018
Gu (10.1016/j.patcog.2019.107038_bib0012) 2018; 77
Zhou (10.1016/j.patcog.2019.107038_bib0035) 2019; 86
Guo (10.1016/j.patcog.2019.107038_bib0017) 2019
Wang (10.1016/j.patcog.2019.107038_bib0010) 2018; 27
Wang (10.1016/j.patcog.2019.107038_bib0034) 2018
Li (10.1016/j.patcog.2019.107038_bib0013) 2015; 24
Shen (10.1016/j.patcog.2019.107038_bib0030) 2018
Yang (10.1016/j.patcog.2019.107038_bib0039) 2015; 24
Akkaynak (10.1016/j.patcog.2019.107038_bib0001) 2018
Guo (10.1016/j.patcog.2019.107038_bib0007) 2019; 28
References_xml – start-page: 6723
  year: 2018
  end-page: 6732
  ident: bib0001
  article-title: A revised underwater image formation model
  publication-title: Proc. IEEE Int. Conf. Comput. Vis. Pattern Recognit. (CVPR)
– volume: 21
  start-page: 1756
  year: 2012
  end-page: 1769
  ident: bib0020
  article-title: Underwater image enhancement by wavelength compensation and dehazing
  publication-title: IEEE Trans. Image Process.
– start-page: 715
  year: 2018
  end-page: 731
  ident: bib0011
  article-title: Pyramid dilated deeper ConvLSTM for video salient object detection
  publication-title: Proc. Eur. Conf. Comput. Vis. (ECCV)
– start-page: 1993
  year: 2016
  end-page: 1997
  ident: bib0024
  article-title: Underwater image restoration based on minimum information loss principle and optical properties of underwater imaging
  publication-title: Proc. IEEE Int. Conf. Image Process. (ICIP)
– start-page: 4271
  year: 2018
  end-page: 4280
  ident: bib0004
  article-title: Attentive fashion grammar network for fashion landmark detection and clothing category classification
  publication-title: Proc. IEEE Int. Conf. Comput. Vis. Pattern Recognit. (CVPR)
– start-page: 746
  year: 2012
  end-page: 760
  ident: bib0037
  article-title: Indoor segmentation and support inference from RGBD images
  publication-title: Proc. Eur. Conf. Comput. Vis. (ECCV)
– volume: 94
  start-page: 62
  year: 2017
  end-page: 67
  ident: bib0015
  article-title: A hybrid method for underwater image correction
  publication-title: Pattern Recognit. Lett.
– volume: 28
  start-page: 2545
  year: 2019
  end-page: 2557
  ident: bib0007
  article-title: Hierarchical features driven residual learning for depth map super-resolution
  publication-title: IEEE Trans. Image Process.
– volume: 24
  year: 2015
  ident: bib0013
  article-title: Underwater image enhancement by dehazing and color correction
  publication-title: J. Electron. Imaging
– start-page: 4700
  year: 2017
  end-page: 4708
  ident: bib0033
  article-title: Densely connected convolutional networks
  publication-title: Proc. IEEE Int. Conf. Comput. Vis. Pattern Recognit. (CVPR)
– volume: 61
  start-page: 610
  year: 2017
  end-page: 628
  ident: bib0032
  article-title: Facial expression recognition with convolutional neural networks: coping with few data and the training sample order
  publication-title: Pattern Recognit.
– start-page: 1
  year: 2019
  ident: bib0036
  article-title: Nested network with two-stream pyramid for salient object detection in optical remote sensing images
  publication-title: IEEE Trans. Geosci. Remote Sens.
– volume: 26
  start-page: 132
  year: 2015
  end-page: 145
  ident: bib0021
  article-title: Automatic red-channel underwater image restoration
  publication-title: J. Vis. Commun. Image Represent.
– volume: 40
  start-page: 198
  year: 2007
  end-page: 211
  ident: bib0002
  article-title: Fingerprint enhancement using STFT analysis
  publication-title: Pattern Recognit.
– volume: 27
  start-page: 2368
  year: 2018
  end-page: 2378
  ident: bib0010
  article-title: Deep visual attention prediction
  publication-title: IEEE Trans. Image Process.
– volume: 77
  start-page: 354
  year: 2018
  end-page: 377
  ident: bib0012
  article-title: Recent advances in convolutional neural networks
  publication-title: Pattern Recognit.
– start-page: 1
  year: 2019
  ident: bib0029
  article-title: PDR-Net: perception-inspired single image dehazing network with refinement
  publication-title: IEEE Trans. Multimed.
– volume: 86
  start-page: 143
  year: 2019
  end-page: 155
  ident: bib0005
  article-title: Attention driven person re-identification
  publication-title: Pattern Recognit.
– volume: 86
  start-page: 99
  year: 2019
  end-page: 111
  ident: bib0035
  article-title: Multi-label learning of part detectors for occluded pedestrian detection
  publication-title: Pattern Recognit.
– volume: 27
  start-page: 379
  year: 2018
  end-page: 393
  ident: bib0018
  article-title: Color balance and fusion for underwater image enhancement
  publication-title: IEEE Trans. Image Process.
– start-page: 4894
  year: 2018
  end-page: 4903
  ident: bib0034
  article-title: Revisiting video saliency: a large-scale benchmark and a new model
  publication-title: Proc. IEEE Int. Conf. Comput. Vis. Pattern Recognit. (CVPR)
– start-page: 8260
  year: 2018
  end-page: 8269
  ident: bib0030
  article-title: Deep semantic face deblurring
  publication-title: Proc. IEEE Int. Conf. Comput. Vis. Pattern Recognit. (CVPR)
– volume: 90
  start-page: 119
  year: 2019
  end-page: 133
  ident: bib0008
  article-title: Wider or deeper: revisiting the ResNet model for visual recognition
  publication-title: Pattern Recognit.
– volume: 3
  start-page: 387
  year: 2017
  end-page: 394
  ident: bib0027
  article-title: WaterGAN: unsupervised generative network to enable real-time color correction of monocular underwater images
  publication-title: IEEE Robot. Autom. Lett.
– volume: 26
  start-page: 1579
  year: 2017
  end-page: 1594
  ident: bib0026
  article-title: Underwater image restoration based on image blurriness and light absorption
  publication-title: IEEE Trans. Image Process.
– volume: 41
  start-page: 541
  year: 2016
  end-page: 551
  ident: bib0040
  article-title: Human-visual-system-inspired underwater image quality measures
  publication-title: IEEE J. Ocean. Eng.
– volume: 25
  start-page: 323
  year: 2018
  end-page: 327
  ident: bib0016
  article-title: Emerging from water: underwater image color correction based on weakly supervised color transfer
  publication-title: IEEE Signal Process. Lett.
– start-page: 1
  year: 2019
  end-page: 9
  ident: bib0017
  article-title: Underwater image enhancement using a multiscale dense generative adversarial network
  publication-title: IEEE J. Ocean. Eng.
– volume: 36
  start-page: 24
  year: 2016
  end-page: 35
  ident: bib0022
  article-title: Underwater depth estimation and image restoration based on single images
  publication-title: IEEE Comput. Graph. Appl.
– volume: 61
  start-page: 650
  year: 2017
  end-page: 662
  ident: bib0003
  article-title: LLNet: a deep autoencoder approach to natural low-light image enhancement
  publication-title: Pattern Recognit.
– start-page: 1
  year: 2017
  end-page: 11
  ident: bib0009
  article-title: Diving into haze-lines: color restoration of underwater images
  publication-title: Proc. Brit. Mach. Vis. Conf. (BMVC)
– volume: 25
  start-page: 5664
  year: 2016
  end-page: 5677
  ident: bib0025
  article-title: Underwater image enhancement by dehazing with minimum information loss and histogram distribution prior
  publication-title: IEEE Trans. Image Process.
– volume: 41
  start-page: 985
  year: 2019
  end-page: 998
  ident: bib0006
  article-title: Semi-supervised video object segmentation with super-trajectories
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
– start-page: 1436
  year: 2017
  end-page: 1443
  ident: bib0028
  article-title: The next best underwater view
  publication-title: Proc. IEEE Int. Conf. Comput. Vis. Pattern Recognit. (CVPR)
– volume: 13
  start-page: 600
  year: 2004
  end-page: 612
  ident: bib0038
  article-title: Image quality assessment: from error visibility to structural similarity
  publication-title: IEEE Trans. Image Process.
– start-page: 81
  year: 2012
  end-page: 88
  ident: bib0014
  article-title: Enhancing underwater images and videos by fusion
  publication-title: Proc. IEEE Int. Conf. Comput. Vis. Pattern Recognit. (CVPR)
– volume: 43
  start-page: 2532
  year: 2010
  end-page: 2543
  ident: bib0031
  article-title: Super-resolution of human face image using canonical correlation analysis
  publication-title: Pattern Recognit.
– volume: 33
  start-page: 2341
  year: 2011
  end-page: 2343
  ident: bib0023
  article-title: Single image haze removal using dark channel prior
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
– volume: 24
  start-page: 6062
  year: 2015
  end-page: 6071
  ident: bib0039
  article-title: An underwater color image quality evaluation metric
  publication-title: IEEE Trans. Image Process.
– year: 2019
  ident: bib0019
  publication-title: An underwater image enhancement benchmark dataset and beyond
– volume: 41
  start-page: 985
  issue: 4
  year: 2019
  ident: 10.1016/j.patcog.2019.107038_bib0006
  article-title: Semi-supervised video object segmentation with super-trajectories
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
  doi: 10.1109/TPAMI.2018.2819173
– volume: 27
  start-page: 2368
  issue: 5
  year: 2018
  ident: 10.1016/j.patcog.2019.107038_bib0010
  article-title: Deep visual attention prediction
  publication-title: IEEE Trans. Image Process.
  doi: 10.1109/TIP.2017.2787612
– volume: 25
  start-page: 5664
  issue: 12
  year: 2016
  ident: 10.1016/j.patcog.2019.107038_bib0025
  article-title: Underwater image enhancement by dehazing with minimum information loss and histogram distribution prior
  publication-title: IEEE Trans. Image Process.
  doi: 10.1109/TIP.2016.2612882
– start-page: 4700
  year: 2017
  ident: 10.1016/j.patcog.2019.107038_bib0033
  article-title: Densely connected convolutional networks
– start-page: 1
  year: 2019
  ident: 10.1016/j.patcog.2019.107038_bib0036
  article-title: Nested network with two-stream pyramid for salient object detection in optical remote sensing images
  publication-title: IEEE Trans. Geosci. Remote Sens.
– volume: 21
  start-page: 1756
  issue: 4
  year: 2012
  ident: 10.1016/j.patcog.2019.107038_bib0020
  article-title: Underwater image enhancement by wavelength compensation and dehazing
  publication-title: IEEE Trans. Image Process.
  doi: 10.1109/TIP.2011.2179666
– start-page: 1
  year: 2017
  ident: 10.1016/j.patcog.2019.107038_bib0009
  article-title: Diving into haze-lines: color restoration of underwater images
– volume: 41
  start-page: 541
  issue: 3
  year: 2016
  ident: 10.1016/j.patcog.2019.107038_bib0040
  article-title: Human-visual-system-inspired underwater image quality measures
  publication-title: IEEE J. Ocean. Eng.
  doi: 10.1109/JOE.2015.2469915
– volume: 86
  start-page: 99
  year: 2019
  ident: 10.1016/j.patcog.2019.107038_bib0035
  article-title: Multi-label learning of part detectors for occluded pedestrian detection
  publication-title: Pattern Recognit.
  doi: 10.1016/j.patcog.2018.08.018
– start-page: 715
  year: 2018
  ident: 10.1016/j.patcog.2019.107038_bib0011
  article-title: Pyramid dilated deeper ConvLSTM for video salient object detection
– volume: 25
  start-page: 323
  issue: 3
  year: 2018
  ident: 10.1016/j.patcog.2019.107038_bib0016
  article-title: Emerging from water: underwater image color correction based on weakly supervised color transfer
  publication-title: IEEE Signal Process. Lett.
  doi: 10.1109/LSP.2018.2792050
– volume: 40
  start-page: 198
  issue: 1
  year: 2007
  ident: 10.1016/j.patcog.2019.107038_bib0002
  article-title: Fingerprint enhancement using STFT analysis
  publication-title: Pattern Recognit.
  doi: 10.1016/j.patcog.2006.05.036
– volume: 77
  start-page: 354
  year: 2018
  ident: 10.1016/j.patcog.2019.107038_bib0012
  article-title: Recent advances in convolutional neural networks
  publication-title: Pattern Recognit.
  doi: 10.1016/j.patcog.2017.10.013
– volume: 28
  start-page: 2545
  issue: 5
  year: 2019
  ident: 10.1016/j.patcog.2019.107038_bib0007
  article-title: Hierarchical features driven residual learning for depth map super-resolution
  publication-title: IEEE Trans. Image Process.
  doi: 10.1109/TIP.2018.2887029
– volume: 90
  start-page: 119
  year: 2019
  ident: 10.1016/j.patcog.2019.107038_bib0008
  article-title: Wider or deeper: revisiting the ResNet model for visual recognition
  publication-title: Pattern Recognit.
  doi: 10.1016/j.patcog.2019.01.006
– volume: 26
  start-page: 132
  year: 2015
  ident: 10.1016/j.patcog.2019.107038_bib0021
  article-title: Automatic red-channel underwater image restoration
  publication-title: J. Vis. Commun. Image Represent.
  doi: 10.1016/j.jvcir.2014.11.006
– volume: 61
  start-page: 650
  year: 2017
  ident: 10.1016/j.patcog.2019.107038_bib0003
  article-title: LLNet: a deep autoencoder approach to natural low-light image enhancement
  publication-title: Pattern Recognit.
  doi: 10.1016/j.patcog.2016.06.008
– volume: 86
  start-page: 143
  year: 2019
  ident: 10.1016/j.patcog.2019.107038_bib0005
  article-title: Attention driven person re-identification
  publication-title: Pattern Recognit.
  doi: 10.1016/j.patcog.2018.08.015
– start-page: 1436
  year: 2017
  ident: 10.1016/j.patcog.2019.107038_bib0028
  article-title: The next best underwater view
– year: 2019
  ident: 10.1016/j.patcog.2019.107038_sbref0019
  publication-title: An underwater image enhancement benchmark dataset and beyond
– volume: 61
  start-page: 610
  issue: 1
  year: 2017
  ident: 10.1016/j.patcog.2019.107038_bib0032
  article-title: Facial expression recognition with convolutional neural networks: coping with few data and the training sample order
  publication-title: Pattern Recognit.
  doi: 10.1016/j.patcog.2016.07.026
– start-page: 4894
  year: 2018
  ident: 10.1016/j.patcog.2019.107038_bib0034
  article-title: Revisiting video saliency: a large-scale benchmark and a new model
– start-page: 1
  year: 2019
  ident: 10.1016/j.patcog.2019.107038_bib0017
  article-title: Underwater image enhancement using a multiscale dense generative adversarial network
  publication-title: IEEE J. Ocean. Eng.
– volume: 33
  start-page: 2341
  issue: 12
  year: 2011
  ident: 10.1016/j.patcog.2019.107038_bib0023
  article-title: Single image haze removal using dark channel prior
  publication-title: IEEE Trans. Pattern Anal. Mach. Intell.
  doi: 10.1109/TPAMI.2010.168
– volume: 3
  start-page: 387
  issue: 1
  year: 2017
  ident: 10.1016/j.patcog.2019.107038_bib0027
  article-title: WaterGAN: unsupervised generative network to enable real-time color correction of monocular underwater images
  publication-title: IEEE Robot. Autom. Lett.
– start-page: 6723
  year: 2018
  ident: 10.1016/j.patcog.2019.107038_bib0001
  article-title: A revised underwater image formation model
– start-page: 81
  year: 2012
  ident: 10.1016/j.patcog.2019.107038_bib0014
  article-title: Enhancing underwater images and videos by fusion
– start-page: 4271
  year: 2018
  ident: 10.1016/j.patcog.2019.107038_bib0004
  article-title: Attentive fashion grammar network for fashion landmark detection and clothing category classification
– volume: 26
  start-page: 1579
  issue: 4
  year: 2017
  ident: 10.1016/j.patcog.2019.107038_bib0026
  article-title: Underwater image restoration based on image blurriness and light absorption
  publication-title: IEEE Trans. Image Process.
  doi: 10.1109/TIP.2017.2663846
– start-page: 746
  year: 2012
  ident: 10.1016/j.patcog.2019.107038_bib0037
  article-title: Indoor segmentation and support inference from RGBD images
– volume: 24
  start-page: 6062
  issue: 12
  year: 2015
  ident: 10.1016/j.patcog.2019.107038_bib0039
  article-title: An underwater color image quality evaluation metric
  publication-title: IEEE Trans. Image Process.
  doi: 10.1109/TIP.2015.2491020
– start-page: 1993
  year: 2016
  ident: 10.1016/j.patcog.2019.107038_bib0024
  article-title: Underwater image restoration based on minimum information loss principle and optical properties of underwater imaging
– volume: 27
  start-page: 379
  issue: 1
  year: 2018
  ident: 10.1016/j.patcog.2019.107038_bib0018
  article-title: Color balance and fusion for underwater image enhancement
  publication-title: IEEE Trans. Image Process.
  doi: 10.1109/TIP.2017.2759252
– volume: 43
  start-page: 2532
  issue: 7
  year: 2010
  ident: 10.1016/j.patcog.2019.107038_bib0031
  article-title: Super-resolution of human face image using canonical correlation analysis
  publication-title: Pattern Recognit.
  doi: 10.1016/j.patcog.2010.02.007
– volume: 24
  issue: 3
  year: 2015
  ident: 10.1016/j.patcog.2019.107038_bib0013
  article-title: Underwater image enhancement by dehazing and color correction
  publication-title: J. Electron. Imaging
  doi: 10.1117/1.JEI.24.3.033023
– start-page: 1
  year: 2019
  ident: 10.1016/j.patcog.2019.107038_bib0029
  article-title: PDR-Net: perception-inspired single image dehazing network with refinement
  publication-title: IEEE Trans. Multimed.
– volume: 13
  start-page: 600
  issue: 4
  year: 2004
  ident: 10.1016/j.patcog.2019.107038_bib0038
  article-title: Image quality assessment: from error visibility to structural similarity
  publication-title: IEEE Trans. Image Process.
  doi: 10.1109/TIP.2003.819861
– volume: 94
  start-page: 62
  year: 2017
  ident: 10.1016/j.patcog.2019.107038_bib0015
  article-title: A hybrid method for underwater image correction
  publication-title: Pattern Recognit. Lett.
  doi: 10.1016/j.patrec.2017.05.023
– volume: 36
  start-page: 24
  issue: 2
  year: 2016
  ident: 10.1016/j.patcog.2019.107038_bib0022
  article-title: Underwater depth estimation and image restoration based on single images
  publication-title: IEEE Comput. Graph. Appl.
  doi: 10.1109/MCG.2016.26
– start-page: 8260
  year: 2018
  ident: 10.1016/j.patcog.2019.107038_bib0030
  article-title: Deep semantic face deblurring
SSID ssj0017142
Score 2.7120047
Snippet •Underwater image and video synthesis approach is desired by data-driven methods.•Underwater scene prior is helpful for underwater image and video...
SourceID crossref
elsevier
SourceType Enrichment Source
Index Database
Publisher
StartPage 107038
SubjectTerms Deep learning
Pattern recognition
Underwater image and video enhancement and restoration
Underwater image synthesis
Title Underwater scene prior inspired deep underwater image and video enhancement
URI https://dx.doi.org/10.1016/j.patcog.2019.107038
Volume 98
hasFullText 1
inHoldings 1
isFullTextHit
isPrint
link http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV07T8MwED5VsLDwRpRH5YHVtImdxB6riqpQ0YlK3SLbcSAIkqhqxcZvx5dHAQmBxHrySdGX8-dzct8dwFUQpYFWNqWaaUk58wUV2O6OG-2F1mdSWxQ438_CyZzfLYJFB0atFgbLKhvurzm9YuvG0m_Q7JdZhhpfbDuIKhxHw7zScHEeYZRfv2_KPHC-d90xnHkUV7fyuarGq3R0VzxigZd0Jhf84ufj6cuRM96H3SZXJMP6cQ6gY_ND2GvnMJBmWx7BtBpe9KbQiM2ZLCmXWbEkWY6_0W1CEmtLsv5clL06FiEqTwiq8Api8yd8-fih8Bjm45uH0YQ2QxKocdn-iiouQndniIxIB5ypUChP2MRPmIkUl9rTKWPKKIe7dIdRKBLPKimUP1C-wfyFncBWXuT2FEgkI81YIFLuWZ5aLpUvE8-wQIci0Ex2gbXYxKbpII6DLF7itlTsOa4RjRHRuEa0C3TjVdYdNP5YH7Wwx98iIXYk_6vn2b89z2HHx3t0VY19AVur5dpeumRjpXtVNPVge3g7ncw-AMnz07s
linkProvider Elsevier
linkToHtml http://utb.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwtV07T8MwED6VMsDCG1GeHmAMENtJ7IEB8VChwFQktmAnDgRBGpVWFQt_ij-IL48CEgIJidWyI-fL5btz8t0dwLYXJJ5WJnE009LhjApHYLk7HmnXN5RJbTDB-fLKb1_z8xvvpgFvdS4Myior7i85vWDramSvQnMvT1PM8cWyg5iFY2nYHhMqZWXHvIzsue354OzYPuQdSk9Pukdtp2ot4EQ2Rh44igvfRtpBJJJ9zpQvlCtMTGMWBYpL7eqEMRUpu1tpKdwXsWuUFIruKxqh12f2uhMwyS1dYNuE3dexrgQbipclypnr4PbqfL1CVJZbfu3doaJM2iH7tonv_eEnH3c6BzNVcEoOy_ufh4bJFmC2bvxAKh5YhE7RLWmkcBCrQRmS99Nen6QZ_rc3MYmNycnwY1L6ZGmLqCwmmPbXIya7R2vDL5NLcP0v0C1DM-tlZgVIIAPNmCcS7hqeGC4VlbEbMU_7wtNMtoDV2IRRVbIcO2c8hrU27SEsEQ0R0bBEtAXOeFVeluz4ZX5Qwx5-Mb3QepUfV67-eeUWTLW7lxfhxdlVZw2mKR7iCyn4OjQH_aHZsJHOQG8WlkXg9r9N-R0LMg6s
openUrl ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Underwater+scene+prior+inspired+deep+underwater+image+and+video+enhancement&rft.jtitle=Pattern+recognition&rft.au=Li%2C+Chongyi&rft.au=Anwar%2C+Saeed&rft.au=Porikli%2C+Fatih&rft.date=2020-02-01&rft.pub=Elsevier+Ltd&rft.issn=0031-3203&rft.eissn=1873-5142&rft.volume=98&rft_id=info:doi/10.1016%2Fj.patcog.2019.107038&rft.externalDocID=S0031320319303401
thumbnail_l http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/lc.gif&issn=0031-3203&client=summon
thumbnail_m http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/mc.gif&issn=0031-3203&client=summon
thumbnail_s http://covers-cdn.summon.serialssolutions.com/index.aspx?isbn=/sc.gif&issn=0031-3203&client=summon