A multimodal approach to estimating vigilance using EEG and forehead EOG
Objective. Covert aspects of ongoing user mental states provide key context information for user-aware human computer interactions. In this paper, we focus on the problem of estimating the vigilance of users using EEG and EOG signals. Approach. The PERCLOS index as vigilance annotation is obtained from eye tracking glasses.
Published in | Journal of neural engineering Vol. 14; no. 2; pp. 26017 - 26030 |
Main Authors | Zheng, Wei-Long; Lu, Bao-Liang |
Format | Journal Article |
Language | English |
Published | England: IOP Publishing, 01.04.2017 |
ISSN | 1741-2560; 1741-2552 |
DOI | 10.1088/1741-2552/aa5a98 |
Abstract | Objective. Covert aspects of ongoing user mental states provide key context information for user-aware human computer interactions. In this paper, we focus on the problem of estimating the vigilance of users using EEG and EOG signals. Approach. The PERCLOS index as vigilance annotation is obtained from eye tracking glasses. To improve the feasibility and wearability of vigilance estimation devices for real-world applications, we adopt a novel electrode placement for forehead EOG and extract various eye movement features, which contain the principal information of traditional EOG. We explore the effects of EEG from different brain areas and combine EEG and forehead EOG to leverage their complementary characteristics for vigilance estimation. Considering that the vigilance of users is a dynamic changing process because the intrinsic mental states of users involve temporal evolution, we introduce continuous conditional neural field and continuous conditional random field models to capture dynamic temporal dependency. Main results. We propose a multimodal approach to estimating vigilance by combining EEG and forehead EOG and incorporating the temporal dependency of vigilance into model training. The experimental results demonstrate that modality fusion can improve the performance compared with a single modality, EOG and EEG contain complementary information for vigilance estimation, and the temporal dependency-based models can enhance the performance of vigilance estimation. From the experimental results, we observe that theta and alpha frequency activities are increased, while gamma frequency activities are decreased in drowsy states in contrast to awake states. Significance. The forehead setup allows for the simultaneous collection of EEG and EOG and achieves comparative performance using only four shared electrodes in comparison with the temporal and posterior sites. |
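The abstract uses the PERCLOS index, obtained from eye tracking glasses, as the vigilance annotation. PERCLOS is commonly defined as the proportion of time within a window during which the eyes are at least 80% closed; the paper's exact window length and threshold are not given in this record, so `window_s` and `closed_thresh` below are assumptions in this minimal sketch:

```python
import numpy as np

def perclos(closure, fs, window_s=60.0, closed_thresh=0.8):
    """PERCLOS per non-overlapping window: the fraction of samples in
    which the eyelid is at least `closed_thresh` closed.

    `closure` is a 1-D array of eyelid-closure ratios in [0, 1],
    sampled at `fs` Hz. Returns one PERCLOS value per full window.
    """
    win = int(window_s * fs)
    closed = (np.asarray(closure) >= closed_thresh).astype(float)
    n = len(closed) // win                     # number of complete windows
    return closed[:n * win].reshape(n, win).mean(axis=1)
```

A minute of fully open eyes followed by a minute of fully closed eyes (at 1 Hz sampling) yields PERCLOS values of 0.0 and 1.0 for the two windows.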
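The abstract reports that theta and alpha band activity increases while gamma activity decreases in drowsy states. The paper's actual EEG features are not detailed in this record, so as an illustrative sketch, here is one common way to compute log band power per channel with a Welch PSD; the band edges are assumptions:

```python
import numpy as np
from scipy.signal import welch

# Assumed band edges in Hz; conventions vary across studies.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "gamma": (30, 50)}

def band_powers(eeg, fs):
    """Log mean band power for each channel and band.

    `eeg` has shape (channels, samples); returns (channels, n_bands)
    in the order theta, alpha, gamma.
    """
    nperseg = min(eeg.shape[-1], int(2 * fs))   # ~0.5 Hz resolution
    f, psd = welch(eeg, fs=fs, nperseg=nperseg)
    feats = []
    for lo, hi in BANDS.values():
        mask = (f >= lo) & (f < hi)
        feats.append(np.log(psd[..., mask].mean(axis=-1)))
    return np.stack(feats, axis=-1)
```

For a pure 10 Hz sinusoid, the alpha-band feature dominates the theta and gamma features, matching the intuition that band power tracks oscillatory activity in each range.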
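The abstract introduces continuous conditional random field and continuous conditional neural field models to capture the temporal dependency of vigilance. Those models are not specified in this record; as a hedged illustration of the underlying idea, here is the closed-form MAP estimate of a simple Gaussian chain model that trades fidelity to per-frame predictions against smoothness between temporal neighbours (`lam`, the smoothness weight, is an assumed hyperparameter):

```python
import numpy as np

def smooth_chain(x, lam=5.0):
    """MAP estimate of a Gaussian chain model over a prediction sequence.

    Minimises sum_t (y_t - x_t)^2 + lam * sum_t (y_t - y_{t-1})^2,
    i.e. solves (I + lam * L) y = x with L the chain graph Laplacian.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    L[0, 0] = L[-1, -1] = 1          # boundary rows of the chain Laplacian
    return np.linalg.solve(np.eye(n) + lam * L, x)
```

Because the Laplacian's rows sum to zero, the smoothed sequence preserves the mean of the input while shrinking frame-to-frame fluctuations, which is the qualitative behaviour one wants from temporal-dependency models of a slowly drifting state like vigilance.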
Author | Zheng, Wei-Long; Lu, Bao-Liang |
Author_xml | – sequence: 1 givenname: Wei-Long surname: Zheng fullname: Zheng, Wei-Long email: weilong@sjtu.edu.cn organization: Shanghai Jiao Tong University Center for Brain-like Computing and Machine Intelligence, Department of Computer Science and Engineering, Shanghai, People's Republic of China – sequence: 2 givenname: Bao-Liang surname: Lu fullname: Lu, Bao-Liang email: bllu@sjtu.edu.cn organization: Shanghai Jiao Tong University Brain Science and Technology Research Center, Shanghai, People's Republic of China |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/28102833 (View this record in MEDLINE/PubMed) |
CODEN | JNEIEZ |
ContentType | Journal Article |
Copyright | 2017 IOP Publishing Ltd |
Copyright_xml | – notice: 2017 IOP Publishing Ltd |
DOI | 10.1088/1741-2552/aa5a98 |
DatabaseName | CrossRef; Medline; MEDLINE (Ovid); PubMed; MEDLINE - Academic |
DatabaseTitle | CrossRef; MEDLINE; Medline Complete; MEDLINE with Full Text; PubMed; MEDLINE (Ovid); MEDLINE - Academic |
DatabaseTitleList | MEDLINE; MEDLINE - Academic |
Discipline | Anatomy & Physiology |
DocumentTitleAlternate | A multimodal approach to estimating vigilance using EEG and forehead EOG |
EISSN | 1741-2552 |
ExternalDocumentID | 28102833 10_1088_1741_2552_aa5a98 jneaa5a98 |
Genre | Evaluation Studies Research Support, Non-U.S. Gov't Journal Article |
GrantInformation_xml | – fundername: Major Basic Research Program of Shanghai Science and Technology Committee grantid: 15JC1400103 – fundername: Technology Research and Development Program of China Railway Corporation grantid: 2016Z003-B – fundername: National Basic Research Program of China grantid: 2013CB329401 – fundername: National Natural Science Foundation of China grantid: 61272248; 61673266 funderid: https://doi.org/10.13039/501100001809 |
ISSN | 1741-2560 1741-2552 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 2 |
Language | English |
LinkModel | DirectLink |
Notes | JNE-101438.R1 |
PMID | 28102833 |
PQID | 1861578156 |
PQPubID | 23479 |
PageCount | 14 |
PublicationCentury | 2000 |
PublicationDate | 2017-04-01 |
PublicationDateYYYYMMDD | 2017-04-01 |
PublicationDate_xml | – month: 04 year: 2017 text: 2017-04-01 day: 01 |
PublicationDecade | 2010 |
PublicationPlace | England |
PublicationPlace_xml | – name: England |
PublicationTitle | Journal of neural engineering |
PublicationTitleAbbrev | JNE |
PublicationTitleAlternate | J. Neural Eng |
PublicationYear | 2017 |
Publisher | IOP Publishing |
Publisher_xml | – name: IOP Publishing |
SourceID | proquest pubmed crossref iop |
SourceType | Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 26017 |
SubjectTerms | Adult Algorithms Arousal - physiology Brain Waves - physiology brain-computer interfaces EEG Electroencephalography - methods Electrooculography - methods EOG Female Humans Male multimodal approach Pattern Recognition, Automated - methods Psychomotor Performance - physiology Reproducibility of Results Sensitivity and Specificity temporal dependency vigilance estimation |
Title | A multimodal approach to estimating vigilance using EEG and forehead EOG |
URI | https://iopscience.iop.org/article/10.1088/1741-2552/aa5a98 https://www.ncbi.nlm.nih.gov/pubmed/28102833 https://www.proquest.com/docview/1861578156 |
Volume | 14 |