Towards Low Light Enhancement With RAW Images
In this paper, we make the first benchmark effort to demonstrate the superiority of using RAW images for low-light enhancement and develop a novel alternative route that utilizes RAW images in a more flexible and practical way. Inspired by a full consideration on the typical image processing pipeli...
Published in | IEEE transactions on image processing Vol. 31; pp. 1391 - 1405 |
Main Authors | Huang, Haofeng; Yang, Wenhan; Hu, Yueyu; Liu, Jiaying; Duan, Ling-Yu |
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2022 |
Subjects | |
Abstract | In this paper, we make the first benchmark effort to demonstrate the superiority of using RAW images for low-light enhancement and develop a novel alternative route that utilizes RAW images in a more flexible and practical way. Motivated by a careful analysis of the typical image processing pipeline, we develop a new evaluation framework, the Factorized Enhancement Model (FEM), which decomposes the properties of RAW images into measurable factors and provides a tool for empirically exploring how those properties affect enhancement performance. The empirical benchmark results show that the Linearity of the data and the Exposure Time recorded in the meta-data play the most critical roles, bringing distinct performance gains in various measures over approaches that take sRGB images as input. With the insights obtained from the benchmark results in mind, a RAW-guiding Exposure Enhancement Network (REENet) is developed, which trades off the advantages and the inaccessibility of RAW images in real applications by using RAW images only in the training phase. REENet projects sRGB images into a linear RAW domain so that constraints from the corresponding RAW images can be applied, reducing the difficulty of training. In the testing phase, REENet therefore does not rely on RAW images. Experimental results demonstrate not only the superiority of REENet over state-of-the-art sRGB-based methods but also the effectiveness of the RAW guidance and of all components. |
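The abstract's key mechanism is projecting sRGB images into a linear domain, where exposure behaves (to first order) as a simple gain and RAW-space constraints can be applied. A minimal sketch of that idea, assuming the standard IEC 61966-2-1 sRGB transfer function as a stand-in (REENet learns this projection rather than using the fixed curve; all function names here are illustrative):

```python
import numpy as np

def srgb_to_linear(srgb):
    """Invert the standard sRGB transfer function (IEC 61966-2-1).

    srgb: scalar or array of encoded values in [0, 1].
    Returns linear-light values in [0, 1].
    """
    srgb = np.asarray(srgb, dtype=np.float64)
    return np.where(
        srgb <= 0.04045,
        srgb / 12.92,                      # linear toe segment
        ((srgb + 0.055) / 1.055) ** 2.4,   # power-law segment
    )

def linear_to_srgb(lin):
    """Forward sRGB transfer function (inverse of srgb_to_linear)."""
    lin = np.asarray(lin, dtype=np.float64)
    return np.where(
        lin <= 0.0031308,
        lin * 12.92,
        1.055 * lin ** (1.0 / 2.4) - 0.055,
    )

# In the linear domain, doubling the exposure is approximately a plain
# 2x gain -- the "Linearity" property the benchmark identifies as critical.
x = srgb_to_linear(0.2)
brightened = linear_to_srgb(np.clip(2.0 * x, 0.0, 1.0))
```

Note that real RAW data differs from this idealized linearization (sensor black level, white balance, and the camera's tone curve all intervene), which is why the paper treats the projection as something to be learned under RAW supervision.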
Author | Duan, Ling-Yu Huang, Haofeng Liu, Jiaying Hu, Yueyu Yang, Wenhan |
Author_xml | – 1. Haofeng Huang (hhf@pku.edu.cn), Peking University, Beijing, China – 2. Wenhan Yang (yangwenhan@pku.edu.cn, ORCID 0000-0002-1692-0069), Peking University, Beijing, China – 3. Yueyu Hu (huyy@pku.edu.cn, ORCID 0000-0003-4919-4515), Peking University, Beijing, China – 4. Jiaying Liu (liujiaying@pku.edu.cn, ORCID 0000-0002-0468-9576), Peking University, Beijing, China – 5. Ling-Yu Duan (lingyu@pku.edu.cn, ORCID 0000-0002-4491-2023), Peking University, Beijing, China |
CODEN | IIPRE4 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2022 |
DOI | 10.1109/TIP.2022.3140610 |
Discipline | Applied Sciences Engineering |
EISSN | 1941-0042 |
EndPage | 1405 |
Genre | orig-research Journal Article |
GrantInformation_xml | – fundername: National Key Research and Development Program of China grantid: 2018AAA0102702 funderid: 10.13039/501100012166 – fundername: Peking University (PKU)-Nanyang Technological University (NTU) Joint Research Institute (JRI) by the Ng Teng Fong Charitable Foundation funderid: 10.13039/501100007937 – fundername: National Natural Science Foundation of China grantid: 62088102; 62172020 funderid: 10.13039/501100001809 – fundername: State Key Laboratory of Media Convergence Production Technology and Systems |
ISSN | 1057-7149 1941-0042 |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
ORCID | 0000-0003-4919-4515 0000-0002-0468-9576 0000-0002-1692-0069 0000-0002-4491-2023 |
PMID | 35038292 |
PQID | 2622825820 |
PQPubID | 85429 |
PageCount | 15 |
PublicationCentury | 2000 |
PublicationDate | 2022 |
PublicationDecade | 2020 |
PublicationPlace | United States |
PublicationTitle | IEEE transactions on image processing |
PublicationTitleAbbrev | TIP |
PublicationTitleAlternate | IEEE Trans Image Process |
PublicationYear | 2022 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 1391 |
SubjectTerms | benchmark; Benchmark testing; Benchmarks; deep learning; factorized enhancement model; Histograms; Image enhancement; Image processing; Lighting; Linearity; Low-light enhancement; Pipelines; RAW guidance; Training |
Title | Towards Low Light Enhancement With RAW Images |
URI | https://ieeexplore.ieee.org/document/9684237 https://www.ncbi.nlm.nih.gov/pubmed/35038292 https://www.proquest.com/docview/2622825820 https://www.proquest.com/docview/2621018215 |
Volume | 31 |